[Binary artifact removed: POSIX tar archive of var/home/core/zuul-output/ containing logs/kubelet.log.gz (gzip-compressed kubelet log, owner core); compressed contents are not recoverable as text.]
nGP.D+~n7ww '\#PsA4M\aP0{Cc#- e[!uCr{qc/6*ؚ (8)7ML˭VTWk+)䚰3ao9hT 8չLc4;ΐj="hK(wBkÙ# %{£m \aFyJ%J+,AG67ŨbSO(0⹀\ k#x7iٻuB5zcQq{mrb Ɍ hl- yePN67߾B括q.E)U[4S6̭VGl@wYLQ x0Qz` (+EtɁVcɦ51EZV>K!u4W6vsj>@sc-SUeՖs=)HhR\8ٱ|㴿`U[ + }M4r5AP(2#, [-D9t%]䎹ws~{m6kyQ|e<عl߹l[_C!].=@HClJUϖG5cy5D#Td~,( t!9h 2ˆ=[ݿ, !a4<=("dbE"]Q{(pFlfEGQ#셪*"1JW6=4tKnY(",X` N:;)Ynq!GޑSLH>pTpٗtnTvTC6V=pSg  ({mx_w.>(zgN%"%fzz BHi:NoUZ]uL߱g8-_aqe|Psߌ=Vv=f>td׌ w2^a禓1kzݽXI a:f|/5톺) V$]_zxk^C,3uH Ǡ^)zjA>Ln=+q6s6aK5kbpsbn~:Y֍ֻ{ qvFd }vќdkP]0+ܤ^N 2m:p\ܘ&Ž%x擻erb޹C$w@\sژ۱yyAfH6g'ӦVV55o7^FMN~^,H MM,0}Z%r ?FOP%[j\4y u?k|=tv󗣟nMF|?񿙛{x29&{ܞă{uu. yɖA>:O9ljKu[jڊpACAhǖ;JcwNCT8#MZ@}^F 9hyꆴֻM>Ҷ<{=WJl1|Xx[)MG/HÇ ?WA͆5F,l:Kn*m%;7 4F2!Q$=A4H{ڴ3rЏh.><=?Q ۃ ɡ>}u6s^}L>-n"2=$BuicBu|EcXw !O>ӗ؋~u1⢧kGݠc@;1MCw:c}S_TH0*lk>9tsO7s I\沎Sttst>R3gЇg:Ao_RB8|?}X˺KQ$xlc>k$ J+R$՜ Ye޺ndQ oӶKY2IˌT ,On5sD;V''Y٥y Ys49SiKڔyr,-ڕwKQr9r"b֤(Eʡg iޟps,Y9}\9EYU{t-{O%:[N.Xv<;QM8Se0dɜFhʍRb ʤfŐR) ̌!ISfzZq zFDEJgGqԢP|2gPi{@@˟׻hGTv5+R_Xa7(Ɇ~i9=W5C?s^Y Aqd  U:l[հQ܌5!})eݲ>\"&cy)k6׭b$&u] UVʐVxMy+J,0Q~ȷNGDK6u4u?fwl:~'S"m/_}> Lͻ0*~ lX(RxC 0!˜_Nn(8-_+aI49hx?AF:'`dFq46IPLBzDkL'tQܤ|uq~ z pr16 )ĝ̘r;rxrj5׮%9o\.7(|Q;lzU+{ŷeFx_>x:3 fi{Bxښ 'Q 2D*֪\"}t8-Z^?̹e$w9A*Hci28{+7q+E/g{KTxUɫ_"'bD$1)-p5!jq(L&W( `J!)jmsRQcpQ0 3=!FiP]ЂRLx`V KC]e'E=%>g'=sK."X-!PV0LR:*FQPhC 콓NА.آ(zX_^oQdfg <6F) KcIDLaI KQňPQuV(mf>."p Sp`TYP&V0vI4>5AAd>_3On?Y1yv&lBp`q]@"#VRc!!$E)U6BT!5p$(20MJgre5y%-J|1uT4W LĿ~Û_\ iMV/^_%GN_5>l&J6"z-Dz"nf:xZipJ9U= C\Y!XZ0+ qkMJf (+5H -#HBTXKPV+8( !@ qZr\k5ض=Ds=u&!*74U!:жq fC6fxH4sኄJ6&؀>bV`#pQOD(Lpᤵ ZmC AD 6eb +*tKmIE<&RJ{K_"(ZÇ~o{M7m4-b~raJ$?:.w.g6OWg˿U Əer}#=/HZ+\"uӀM/X <Ų;NrZpj^?'+I[9/9=X&d<sfZz֢VޅeFRME4K4뇗_IiBr1(#:mTn-r+ִ[@j:$䅋hLI֏vX>h\ ʈNnU[Tִ[WIVp͑)zUIQ -2F"p{-GvCB^fQXzAmvk]I4{e[E4KnY%}n6h9}i.[@j:$䅋hL ޏ]Jڍ햋Amvk]I-fvCB^fɔȾ݊/:n6h9ݕ[@j:$䅋hL5KoGq\V[&52ۨa%Dz]쁖^wAB^fɔ&PBCg3Ae`Z,:g$>Kp׌!l(u&0zQk!ޙ 1}n5CʐRLЧvtՠH9[CVËjPg5a:;Pwwz1gn[ZLЄiڰVLP0'WPr(9Q+aHGAHmȠg`-6Jd (ܫ!?AȄ iDDRNGC(`g'l%V[g46YEjvzN#Ar@IӀ|id*2^hH "aFr:B%!Ge"JA[5o|x%00h^4ʃPӯB;lhA=XDPzREd" 9LF؂m!T?h8x*B,#rJ-PrZ@ɑ΁)@(JB(.bT^_Xy"!M0ɟwWO7d6Lզ dش펨WbBO ¢b:gdioRI 12#@ߕNA%,{hh2]yXSָ07M.j45@qx'WG% E~FwmMn$UHJu#m9/Jma0--.IK\1C.%s!)3q94k40oon!?I :3;?YCggO*'.O}hfG4Gɗ O{o-Sou eAZcX$,<2(Ā֢.[wݝo^udm@S 1Oh,c͇,P)ic|hѕɚ LQ+0zW ʱ 4] ƭU)q6'C]n;"1XS@|0[6Ux0ط=QSw_ثߍ鼟.0]ݵEKz }PY8ɿ~': _ iiJ(DJ)4A,8v+l(`Y$Trm4 `!|QSʝ/p Iy@= !se*u>/3X[o :y~N240`_W/w v `g@KQNڟLgfi.|B? tтa?·Cx6/KR_w&(}JaƻO7xNZ`;5B($9ׅe[}Pɍ.+o]b$%T1>C ی8w&/Yp%lQGl<l)Zf.IWY *~6;X •{_l lt}+.0 ߖ)amUegB 3 pSiZfSI?|;C,k+[~>"ԋ%I|Ǿ:@+S( #;M?_|| \ /y}d_e}idMg޸)c*xg'96{"U{Ěu~Cm䃛>ǩK'K(Do͙Cܥq 8)Sd <`g (.$L7LΩV)8+t"̓ P&n NIGىGh7ĥf>E6}}|`+ "s+{巇eǃ( `tb\W{Sog3\KvbYD2Cλnƀ!`?iacG7==.I$fqbȢFLbY ϣ=c|-w݊Έ2?yC^6})=g_? z+RE{fhW^]VS />,V;U;_ܘE3W'Y/"\@ |qg:e:<-pUzBȧNR/E=D4/x*1S r҂V9 䍃-bcG";Ԗ%1N%WprES+=geY >`a8| 2iW"*-_FN0;cogvU>KS *u3 rΒ97dz  )z?;3y94c^gnY497wO:oaញv~* IE ա:b!Ms>q^D(dDzj/ٗ㕍㾳СDڋ$qyrHf/¼`cfѻjӗHTI FCHe4hjz`E*zs=NT{*p9Nˀʗ>GӍ@FUŗ|/|A?xKc"u-YjLA4!a/8AH%R& \3\Av7P!2JDSp]"e#$ ,*7=-\!7,J_Qq´{Gn:$QoVǨdBwA% r-+iK\ F]Cͥx꬐-5]>Zdf'+ Ev8JaxW[QϋEFO AZEU璇Zy1HU%ubD?UAYyO6~*]/ =`NiBDE旤{ wN|Cj'+MQ@aB6BBXyp9:ߋfMOF=&ː+arWyfX,hUx{{yo$3C`.Ow&gghI"cK4bW SƉd63 ; 'F!b>V! *hqR@o 1N@)1y~NDDjaSj]/y8cݢN&Yn?t_x>Y{/^Lxt;鏢G(cc; #e5`3BJƩo?2nYSTU: ^h={p]I7J4!13\!+eEFn|Y`q*cQS95;k"FQ 0JL`:9֌0VH45U ZWh1Ap}ՕkuO5% <0o]/^rbb9GB'+O䘡_{TK0z2˃-2EHSY^Ig@u'cBx+lf%njv4$u@&eUf=x_4l|)Bؽ4-5O_]\iY˝/;/4+ O)Jݬe~]ELGunIuڭ*BDe .ִ[bpڭ h%*dŹ8ݪ DtQFҋִ[ڭ 2%4X4s‹Ӌ3^zh2~GZvQ@D=uo>͠,Hb%*%<٤ogYXϳsڤQ_uR*Bk-Oa$,JSgJV)@뤞I0J+puU-gܻTҚQ LdOHgiчH1bDE0)yoӫo'YJo!KMĘeJ8XG{ؼorC r֔$p_g j$!GL)dĸqB[=clo7-2\p ܠ,;`}p_O54}*[׶Bbu6HH!# kNT*,h؃,q`4#cESDIkEڛ7N] 8(S1_2Tj#%W>8hEq%&j}_}p֔f$G! 
G [scw*'wmny&:Ǯ2Y.n rnOb]SMLzI=2Q^XEM]dM?3r*!%,$TO X?=N%n;ە0ۏũnA_s ^ڔN(NQ3`C`e2ZM]R0L\HF,\j޷̂ʛ%'N4Xߒ '݌!^&z8HsT䴯MN/,cI2cEL`8񎙤D{X.}\\K';<ΕܜLyTg7"kx7p9xD+iQX32_<$f38"~_>Fe32C)J.]\pQG)s4ʰ.13LKb+0fq +0NщPh5Ti;8hƛ6F<RQ/6 z YPA|R i j;v0wx I\H6¨#SzqUl baA,l6(6[>OՆSor 9IP]` rj@Jq2$3:i#JX5旂 "hY֜SEםeONFs(ZI߆ֆì˟-)UԿ/:|Z\ f1s6.*Kńϫ$@ru#~7?pmqU<)Y.wsW$Sp9w@vLl Dr0cTm\ 9 :)[$7Ua&0Abw3g$(o4HTF5!n:r*oOzHԌMA3RI%ag-Hp>*ocAد_RUu+=BBcT-X ZdA(K뾨]hћTVmGڄkz>Li"U(oBEﯰa'򼥢+ )e-iM:R=$>G*Rg 0J+kᙕyA'A1B DY#AG4XJ0Fe1G'4(. :uĉ@&71(T Vσu,gWt : L$Sn ʨ.7nYWP\. QwxC]4zhx0ѽU;\˿x8Md׻-nf/ S4䕄R% g`~V$dӪDbQʘ$Ή5:0,6XԖ\zD.2uo>K $'[ 4R4*\ejT*RD)^OeB6+,r4C,wYN&DbHB:C9wLkQ(R8m|6508snzֺ8`qӭW 4|3S^Jtfu6Ca֡>HxF8PGepڡCbf 7nւ_SYvs5(2ݜH@f9ۿUp"SI(`{>A4n53F#7BQ7J8o҂h|R(`"HSݲ1yz!Pjm"|[6dp"dO {?m9 4'^V(:=A ohA_RbD7- s4DHiN`QA1m]U8頂0Í,p+lw2Bl<s{<7gP-$3) ݚ`rySٙ{Y/(Qtei0v36DEn֙0w\%!*:uΧ3'mNqQ\w9۪|yG*Akf^8xLjZ^j>9m)Qŭđ>Sx I<bTZ=Sd0a' NAy2OIOx!gq*D \:ora`rE4G&^I㉵T萻u)/u7C$ʼ2U܅-^Q&- J7]> 'jor}uPV^0(x;cr<%de peVZf. U"ЪO եe>p,‹qf1n7ݫvz_^Eu`74E^4zU{|qp7oZ$scضq;n=:wp|FrоڬԜN* 3Ram3z/3w}tn&k|Da#غ'kT{jJKyfj.:)01/87,N^ducMwz K1VBǸg?yedVߺכ.}X)u߱<:>a18G1!_a<ǺxR _MY~.|$II@m*,@ri1 | g̰gsHje+xPU2jW`cpߙLoA;pZr]-_HOJ_{s&qAR&OBbDž"&;" 򜰌I*@SWA`wPDX둑>,V>Xƶ,238Xńɵv&Ht%-I焍},[p,V'l fa%d/G9*R8"@u@*W8g GK>Aq3kϚ{^ӌr5} =ZIisKzrp7 e?Tᨊ-lE밗ţx'\졣hV:jm4[qT[*8_r>għMW(_؛ʢI9a)3i1X=Z+;q >'\3Q<%d:͸6wFMbO*? J8Ǥ_v3qY+]q5np0ARk.=m,ҿΫ%f; ǥ{|U*\q* W{L4 ~=ZB$wJ޼ 9ht]jۃDO\@pP FQ2Gq5%G,!ߡxԔSg^Vp> pʼ%\s!mEO`/A4eTB{\D%( D3 yb)'}`jʟ&H@o\i(1=E66(2YFjʵ^e2o3gw8`p5vbl}5 CRCEb՞EHʨ*HͬR^N2m [)`G\NkڬYF6`dF%Bes&)q(d4rjf 47(8cA# YF%(36]@gzϳyZ}\AW8CEr-9G}jK+W㺝gM Ƹ<|b㫸>5CJFvG@+1^T06ҽn SpɇT-UxϕL$&S}EZ|)!FM0^v;}Tne? dY[mo\h`: D]ĚS8,(̳L˃=3szpI`oѫ9¿¿ dQ۬RwٜGI|9wW@3u!#1!i2'fn ׀AK\80ʠᭈLk[j[*ϐT˛}: O:q'RKYMELKJd1ڋ7[-U7IrhP% _y,ZjfN%*ࢵA4w$yJivgx3}='quR =aLXt7zȒILyk؞29e$;&=Oj&s;#RԈYBܒ.z b#7\d,dHXGLCʙ8 Ő}K ![^(6X;4nT]O>odA*IV瘔19CJ˴)cBԜQv>1KiM'2t [t#F<~칇)j7 {/jg}qg}}t8ilX޵q,Be78K.@/k+eB_-"%c[=$!9g83t=_U׭dpE().yCDoHDǩЃ_l>l ^ rĢ9VA] VF9;m(UCMm8 _~kMVx]6[,{SK?_gu;R4yEDDD" &Ԅ괲2EwՔo6D0Z.4e]S[űg@Cq_Lj+8J DkpL, VDEǐ޴H- kХm}X-[(y~[7jQGWlzz*pP0&#˹~ >?~jesX|[!e_/@@o1z]+05Nq}hj8|{/13TpnjWpƥT o凛n/8TVpG#Tavr%t4kƶjL l7YI*6$Q' z)h EXsOGa:fQM&Hk)ނn-X?^g6֔~M&Åzn sp\$R9XH0ڤ? D YHJN:Pݾ799^ ) KZTY6:fS ՖMPƯeuܲXW4"Et.>[66=f9 'F6(V۲eJ#=˦$ EW -`[545c &  XKUY")H.hHFr'1b(Q|LՖF=6hJF(wHQ\sOrK`4>b[1&zEQ3)1~%5plfM@Xx)0`u;g<W?]TK7 @u½غ}] tYv p7hfr|"i]߲(,~+hl2xuZi9џy^5!eWnx;= CATЮShBaǔIEOnڬ?BE3DW[y)V0+uNP%Bnzf}J/Z9]f=uĪk=Z._Q&2/~nkݖOdtmGRRm/cー}\%+iLuޓVYSކ(Is h߽Q D]׵natcݦ.ZQ b8T~O5x rʭkZQr=۪@~GY3ziy2h؇{^ٵjGkr9|Nܻիej٭wwo1 .H}~nOinIpj)QX]x5!‰x.+p @)lTtaD!FR5r-IU9_aXqfx  CQH7=M=m{4eW{1neY :x ,1,YFLǴa '9XrYdnLfBl&;9̆1Ŵ_RF8`JdBGڐ VAq TQV6 ;ٹWZ+ g7 IKϓ|67M\]!'XoߚKrqd[ lw#lPڣRiU#)",θh+Rޱ>0w˥k}KW!΁]͏Z, jP'F}eQ2b8k\5wqw, [Ά8ͰR4Z&;‹yqׁ&mV`@Ѭ~)u>zб I6Myo OG[zA,6goym$Oe=f@hPZ;Eqą0#Ή9 fзܩi]K+1K̈́If쑺_]fH?g_s}87ZA]-ZFDОGTIz,']{WЊR_,Ke>Xq\K i5?:'>.V-'ڐF.ns-pߏBR\L` AGxuZܚ+s:,=p|"ξL١5"cEj:FK{ĉ_pAȗeYRuWxk;k+7~}7x[ח댇# z*P>~my~ X%~ANVʭ\}\b1T}\ OBR"4#9AR*eKPkS|`n[h)SO>):w模B;7UAbRM^x+?9è!gI"FCcȁU#HGy寖0:}.6e^:SEE~3k]rx0υ[4(U 2qJ'zOV\OT )@T`m'1X]q P A5t \F 6`o<`T]ɮAyAO^zk/W/4NCA2$^,IOi@\);<|BK||%g_ʨŠf / b-~l^5Dg +x\BF9qPs,w%_(&?Ax [+"%Su` W8 [etin<1uE6F#8?DK>ɬ3ے؞x[YDO6_16DW&d^jLe=gX* ZU-AqplX:yEґZp%@i۴ :GKUs?QpΑ6c6/(}$Dt~:`فӄė/av45GW1bȽ7i& g 1C/Ug-_KYSgKbUqrNtXs{ώG}HЂ ]T>ZgE3_F s( 3g0tz)1. 
b@}L>//%{[~0VP =obc˷aZ={Өkf;bza_B 5c@R`ٸv ya%7ꓟKח Rk J)?.'~vӘvЃ{] dPxx>ϑ{!C4`\/뼄K躼Vnt²D }(97H} ſHrj=<2=5SbAҤp7$m_t1TpmQ=rKX &^f0 _~ +U `[?\b0X axQ ,~Zo]X#HGD!yF8] C "x#Dx5q0g?GScH;F}-k@0&܇HtK""Y" :OВ11X0IP }TVц%{…LLC@+$O aO#DžTpr\["GFl ee -dpP[[*hCEgNҔtkTR" G"VdR,P7y48B%MCJ|I(b:"g93^T$X8(+얒cE@sJlb!F 81ZNYW{J f({yu&Gv/wZQHEtLJ- MNhI[B̔4qmN(hӣVQjo0T;჉dp1ˊFl;aoH|nf7]:L\磃bfs"A8qh(z]~%dZ(ƚT=~~W֟K̋!#XOD$=>pu=6lX ?w;S;_8on/|& ? PCw7 $.H돹qLN0xܴ5]`րZa)Z=FB"9_E\V U }#DNc,Fo*YU^deh?B,ɣˁJHfJsўiaEiWksd9fE@3β PJ7VUA>\ 7h<<t< dQ $ЀJ; (5XM2] (~/?aRе܎ST[T%$ӌs^m((`k`NPN@_a(ES7OBٞaI@GTlo8:VZ(2RD h+Dn2`yUS";һG_8nJxA Z4Yh*p##p\?pA^m ݮ­3t R#Z'ϒIP>NHSnə< C TFh^̴nWfTMj`JM^^dwdpM1{i4 \hFSe LkaX1B R[TS8z rF pTF,KBcAȼN%f^!f;CC CE543[]GJU9*$ T[b&^#f8uL `'hcbYKaw.hYl+gUvk!0tI)EZM`({cfC\żMeQ7tMX|eDVsETLS~haaTS)P0CbI:UP-]l6Im jSmd Rh[Kt5R1 4TJ2oIpKv5٪1H@vLzjxv|Y4!Xkq1EBup׌9Gr:pxSܟڀ5J.ƀ.aZ{[n'mX`J}4O`FO r+Hr3XRIrb]].>ͪUqcXQ+ $XCF/5288V Fq*@}V3!¬@[%/Ia RJcT4P vZ°W7 Æ!pc00r;f_8Q#~DR|3+ HɑW3 6t0(%b%*jr(q9TD!}*=+٩"ڟ[gz$W `y%SJ(#۩fBQoI$M9:9e@?\kv_@~%kXv.X"9-#8a'!W /[,wVnZ,\$x9iqwᑁ[۳y .[][YXwVտy "[LJtvp7P'UKLxۿd|8"ڃ8uVV( ƒQ8plb>*ei&62Ep~r<:G1{Ǿ(`NF'jģA9,b˺>/% cKU,K g a 00 ܿ_j~8 _ξV)%:_rafzu|_0q!PSTqW"5?~ RFoH`S]qK9!aLs@p9/(1Uh-a.y~y pl v%FD_1tZҨZNXRcVR2@;1,YCzb{_3LE:'">+pn\]P,> `cؘ"Z`O?|'u#UnJ3Ajʆ ʾ,xhF3xGCݹ Ƌ5Ok̥+*NUl ͵+09)8o5-N">v)iS,@qwǣԔ=yyso4 p" "B4\̓7mx)T\j{\-N gGg|ROWj]jwgx~Ѽ G) *n̐bHpeP A1\6CsWF| GD-Öqɝ'N鴩eLbW ~(kR[Y588Lb9V%Y>TIH"^~%Y5PI$ gbkNUٙy)IÈJXcŎq!ˆs$D\}sKDzJ[P'ady$NoG(W?o*P+X󧹷OOl֢ǙBmQtN>__n??_&7|?s0d8n[]Azp.9>RSRR'vZA^m?TcO `on&aPN\"~_B& fPD`D`I$$"eG$T)@u|17~7wG@9:YO;)fǑ?x=j!HuWnUUob8oAM"&S;2hY/Jڛ\O/<8 fUbVj7fq~=+7>|7n#/"SF/V .j~zO;Ɛ+;l$FR{𓭬Kf C~Ո9yo*?~},ےEo dS3KH_=pO֯{bpԻSjVyDhƈ洢)EFy5cQ%?at[,&WJ"itibD2vBH݇tBak̦64 0ifre}`nc"+itd.''i&D{r2Ӱ92"''v}Ĕ)F4nr򆗠3)0)ary(ݚ$}p+4=}ǩPXc 8Hf7I>|X$sۃ10VS}{3^)ҧÿ`ja$%YiPÎ>e?l2 A zFl74}$'4p]71f_#kD=u %zXqd ,0IE('hXץ=C $'}c!^XJkա/ݎM&cx.N>PK_I \N KS%RD \Q\iἭ7)LMɵv)KEwԆ/0I0^xhM6 MG|l1Eb?w wK oEO/4s[~\_Q!V kBݝ_?ky:-i$D$!tppyݚ5ӿ`Sn8̾S=+;7.GAڴ941Xwd7p-kqs-k1o8вe$rhpp7K"!/ܹOC zϥdMECqbWr,$T$ ,)N(}QI+8d"֓ C\ r)+=RHp!0d~H&Q#PKm!dΒAձ/1?P]Dr\'cr /"xQDcG5ŠۯSeZzL=v;B Zg`)y:j6h!9~iX&oJ[WL陪B)VKm<|-m>ЧTn蠋*F.Q`Ɔ#>vܓDR>_ۢjt)z?FkQP{V81[#^F#YSI5#Ho|[uN|=a'K |myKy}'oc"p{0]/C.$Gf=q]Gyw7DD5rhxz!xBY!krY'#Кݮ/ ٭]U+ rS(uS[8yȏ5f[O>P>>ɐH(к_j~8ޯ|Y1SJjuN'73דO7nA!<R`/C*;Ey!zq@=ƽ${5OAq1Qι K \Y*e;Ao__֥s!X ]?rʌ&dfT jmToyW+nkrdJn+ڴd8BKmg9ie$%Bb L_'~5)R@sv} u YNrԝ1sR4$T֕LEVab;s=B"-o"h?2ۢ'q_1&5u*x@AkHYQՕi$8nHvf63DfOz%ZԲ[$#.-7*۬Ȣz ݹ[P"x ^UUZAmie e|;9%7F52'%84Q()qc(}Ab([j#[x{Ep'|JUuWPoAK)K2;;J $NYaY/Bp3%7reS}@"rN!c\?( Ƌ+ T2\J|Ta'K@gzRH@z#MEM&g8(Lс~:V )HDV>L+*gqZ#[]KHeo$$}D=R$(Ş%ǰm~gnE=|h Z{Dۃ֒nt8ztYF%pTb.5?cTAֿ_W?pRcݘdA͢Ɇ`cLK~F"DMIA^ϔ-o%y8r!b 5G=Mn?SHs ofcB.r,SB$ҤiV/J7H f%vGXJ+=乽 TJS{0Z<&>VŭdKCZ#H"OD *a! \ T³._wSm⨇ן9N:^䣣Q~0F%p\Zvjuce`)He$iaݠwuᬓVx"ES|~r4_獦P h^D6ZsJYd'Ib ;ӷ2Eȁ$xW]YxčpO;P Na5N. 
b~#e5xT6K^oEBB޸Ȕo9 KFvr=v_ݺ7.Y2u6Źݔԧri#:HnczYٷvF4Vu!!o\D)$B{qż=9 pFٳ>C/fU!pWZ3V˛۝5`Lb VYP2w@aJȡn,Yp_NIegH.FaK3wb\7JȔ$j,55TSBCƀCUPK/;Yzx9i /d $y4*[Emu)۲IuKiwunز;wԈ=@`APqFqh&J лKS7I.k;X,OʫMgP/qXCw|y-FU=4+lf"* ḙ^{c׾ mo27 xh< t[fj6=7z,&fzVOGOW::Ɔɍ96J_tϷ6&gcP(T*4ƻ,pg:N; R2-9ji C}^21w2+RFATheeV>ZNe$Bf& tz^2*A?&z[+'ɿ|{wĹ=X*Xo=ivM˧c4hnL#ׯgvy^KlPW Tޞ;oYj;n"a."O-AleK\H#o+ɯIkK.y/gR7OrWdgHϵ-80Ć1kzkyB oPz*H[3kV3sUv-=/Ԡ8cjse*يŘcp@R/u?ny0 ]pѓ.MDܻŢZ7]wJB{@-UZ8끎CgVAzpDI._Z,Ghn Nf"l'#X"8YNak,耭;4 0pAruIAE,}2VF#9:($JΆ1AyM#,6`Qہ$+iUq\Juu\P:>]E")~D-˕B AlcMo{kXWA{9ϮY̷@ʲt|Ge*2mlSh|Yޔ>4U l܀XUHe ௹,Rpbl,y#1R?om - a x0gTl/y$,҈ Y -Uz}a5bk0 pҎdc,-tlL90 E8Ya)qHAAK-!Wo%Kc&CEFp xВ~E0uHיKG w}#U7/kywg3~c7Ytso/\ aQ\ttq(w`?u'G݋㝰 *7w9hTD}5rj>]V*S=>hZ~1p lN*Di&Z7:A)>x)$ZꚺL7{ƴZA :Hy"lj>&SIrvc%TeuuÖ48z\jt rgCi nm/iLҧCFz*1c\HaS/bkz3pnVe7K%ȝC((,>S0Zv;"kd;wJ(l!ۛ' <"@0AC7N>·9j7~ձ`TCu 2i 2F5H68MM9ϺAPkH{:Xublڇ@}F)j,>l7=.Zqs{;Q0`SuFOܑc''C/A9>+LS(8Ԯ1AOWlPl8bݛC!Ɯt{ZSokd=-wy3g=aq}ILM$WUdͱY;=mț@kk`7&ie1/x /[ZZd]`6U>Ɛ cHfRKCv9"Q*JI+AeFg{g{]x$Khcxagij qIn0Ѱ`#c)yɢڼ=^Mh,lGQ М_/z.fY`zK`Y˅٢EN{L\S_geZ HWwx^JHuH(%֤́;=茔˗Kn]V 9̉ .pXKn{7Z&!C^;鵱<,^usj+9/22Z80ocMmأFpI xCڻ%ϵvññc[+۱fY7졛cFd;nnM~77< q9V`"5~_fn}mں[uv;ZTd\X2T;BA_ketB5SLOсVtCT$Ιz=.H05I(j;<_QZwdޙ'.g')҉WjwӲ_AXX:L+y|~}jVo r*B ( "a5-ywwP=#ȵCpϤ̥CXY$VK@\XJQͪpkK ,C)y]UQ[jHVu*7´9 u~D.k{+/g} yYm=Vْv!@eJ2uWVi '^-g=*Τ}*~30+io2ӰZ؉ ]eAV(y˂H0ډuQk2nlXf-+-*-$s8ert-"N0$ |@pbOvQgIdxP,GjaChx,.1ј6m4RSm/ՔP S-4zC9ZqZϞr:&|>EanJKOQ'뤇zX[ ,ZͯzD7[͢N+mz6@I.EܡE[_"ym9!~V+> :0k}⍁ Zp]P*nquNO@Zx0cI#jⳐ0~+g!p0+tfcXI5봝 5ie*r~)3ۨ/dX"܅^쀭4DkH6tm /bua{!#3e À0Nf"l=X"8YNd5z<಼ؼV^\d" 33ډ cNZ8ZjdSA4Lu. Tul/m3J~8Fס}oQNV@%҆3|i9$.R(hPWcDo9s1 A0B¶h<sCZM/-YudY&_ߐ0ƚ\r`;NZxQ+ݸE"[+t{(W3wɽ7}@#~-м䵱Fϰ6÷ mlܺ dgetjnm]J3 խg3 nB[ϰO?0h !dL,'lJ%=B/IsLTz$r$Ucw_Yo47]褸)o?VEANk5h%׳i(+ϊ,UGJ#O1)2&Efp5$| ZR9fuL`zVQQWN2XFOUN>?[I9sgSI߁>L~^fP,qR܍ʥVcNEmty:ރwsPgIǬ)osS79L!:vAN?,oAav?}8֙pY2QOYRG}T 6{-3b '`2ɑui*65!MR<ӎtet>;5Ovg_5555kl:>/EwZy!6d 4(DH+@G{Va,:e$Tt bTeԺ,PQ~5Io7wEf!Aػ6eU(pbS$b.N.qA;ZSDGHdy0Ēko1Jγsm5b8Zߺ[s֗<~U*^_te' QS<m`:<=抾=AS@Hoݔ[!8TLp۹?+ޱd񗛟L F-%e8+p͜rs6BRp-[$Hb2;eԼ5SdɝrXP Zj5B"va)Q%ӼaIq\kCyNyՁ*W|%1t)K`U߿azBJ'UAT(y`-wL)JD,z70^i+";¾*VRV>ς _ND| .'%A\[ PNԴ%$4'M4 [t|2ӹZOfyux6vNg"yKddžp`(*Ct 5#H D*:4x=?_^V3mِ etq +{i(0\1v~ D?k{]@??_&9w9''wv?nngAzݓF{?n|={}7{ǭo7_FD{=h?a('YKD} `޷KwzX!3}Ygd ub;7NIw_IH+/_1tez ;ໆ!B1 :[9۶i1&?q~59ML^N|у `n}F~7^kspyQ鎍35z@{jr@sluF3=%j%W&e\rxs4^~ri4-%Z/qa4}S~Q(!~f?_EO$ZS{a2\͵g6tlfaTMb/>YR=a SJ1fcWFmJ}ZϗG?^(4>`ڧ6eImݛB#t{ݸx3oT^9Y^|2~g}ψ(6&H8i>XQ8nX@YQ8~WuJ񻪃/na|*9h@҃ߏ92Lx-xE#0֌(89L c'5Ky;H@+$?q]Y(yuTC' [Ad;"o8Br\ }aF\.?ZwX%dBJ-?CF9+D_^ Oz3Q\2c]KQ\Ҕe[}0 &X&Íi2q_fKL})/es/d ZJAEpg2DY=A.)56\i\8XuJƪϿ:8Wfu[y7S+fjL'WZy#PWc*ThF -n).W [x٥UZUR)?D A9) tZNGŃДHg PA#u$DHI贎Dב:]GH"ѥa%7Kol}0402Q bE81 J.\6z%D/^91TP!;;Ia!kBxΝNaNfRGn &枯8GC3 }M.ژ"In]/kӝ aǭ{[oNOjxH mbns2?>æ[{;'q7.7)lv`) 'Ie´b95\ ;]N> j3!3X܍uQ)чVqçq# R=6.;jS/)'jy/VwH'qWX?:b9+/wXqbwuVϷ|bԆs̨1cEg!RhH;EkV"}T_;m0WL$,Pv1vSSE0㴉Xh\E6u@S8ͥ h:@L+aQY,=4dXK="2vэDd5+dD>CȇM2-#$^#~LhJjR6-FN©8j21Gq(,b}YUS- P7jF`J , 2Vax4TU%D EK4kδI5~ &$,͜,8;c@cRfuryXDF)H2GyBLarWjĝ4yD40,#REasKP0CP``FD#p$(0M8KV!o^YJḦ4'-9t#8w* $3XI`;MD "56iH8x0@}fDpBh*!2;O=2PFeX+1p<2$& ˘Xx؄£ S/c!dkH%f%:ȫ'vXbЙ !uc¾ҧ[GeEBK#W )rGeBp8yHYv!}; aK+XZc1GG Ca:+VGUT(9?j9SJ_ZUB]:T#I :%QիP^ֆR9iY>mϲO۹:|" m[S @yi!uQx޶(<`xu6KZb Uh ި?,9?7ZC!M1uti{MM$Xvt8}qIO<5dJ1El\;$T,IAuH6Cutr1pd9<Ɨ(8f&dS.Ȧ*-_&BsZ'W)U} / -Ʌo&X(J7~@qZI>k&P\(s,OW'_5B5B*X$Xu(<5xkKVյ람SGGeNݰc^czY{t\HuO/F}b먑 _^~o}uoب@G^AT@^9Hbd2ȐsSU(XN|KM݋zŶԽ^|u/_dqMfe >qZK[<IuUce*)KDrkFwEڡrdU'ν_cE=?BE< \!Ls6UHNy9?RSĴ 1oaQݔ\Ӈi'_`"W, %ꚁW\jpBA%>L}j=6$'L'wk>30gMmuiJ|xӯnPark||f!U8vk&fPi Cy X/kNx<&N{XLjV /$deR/RpmϰM15 
{˳a@&:%Zn9(/$b=1JJI-%K\t}e]w_sV.׻r\./nwyf0n/i t˥G#o/h YU -肁koXv A- SMT|@DќiG| 1rqhMNZlQf0Tx{Luu&+7:<(r>˿L-҉ᢇOiB l{R5Ń-ߺXrfˆs"r] ^f>bnxת/.ĕBt7V-:: =NB: u$t}$tɡ̬0TXnv^b3EqĎ`qETi.^pg2Q{ ֭Vȕw8:b]]_sirٙCLb+9S=H(Lh,MRp TB`a8_̝٠|- xL ] ܟ?A155t\f/c ki .*'Ӆ1@.xaOǰ?Ÿfbǎf֞ͅḽq q'pG6ZG`:v}T`ԡCmjjϚ$F*%+ |2{erdA媵@q(FYФ3l=]5EU?ey^> xvJYQ|mQޟQmgm'y(5aSO6KWd)WOk};^}|iι:X}x{`o/u3,%ɉtY355JAɹw&]bVbɡWB4: ^Sl:uw~T]lܩSwNIO7;i 7J@Š P@*sjH67Z;ZKfo9^^Bch)훝<~!r<BϾf4As%$Jdob !˜P F5ŦOG:t52~#1gxBT^#}TMKL |/' ̃s~kx({ZW*{) 2mi\|\`/Yd:Uݽ 5A̅{/ؚio:EGӗX/ma ȫ=έŦ0ScV"SR`55=_&?7GνP)Z}\{><7T]WЧU>kB"Y^/˳6y3WQr8SQ%@hPNV:Ȉ@C|I|Kjg:߬F;h]?maK{LDk-iXs~T6~1~jZƨom)|=z$$zb"5C2m`.sQW `$F0 V2(y+E(ȶ$mO>%{MIڞI|4Am[)7RpBWc+U1XJ*ˊ1`4"= QLg׀CxDdMDn\97k".^BwYtjdv>W-J8! ԊmY1C52 X(Atw:ۺc\|qp;v>hPf.ZH@DHL.f# y|mNkW+-$XE&ηηM̽9+3]w"Pe88qT,'sx=ML19^^=w2xFxu*kr %1("M9g5M&ez{76Y[Wp:u zL6ZNA?? ,'L y[{~I@$~ɈcJliUqx/A*/)# a0#a 1tXɡKsIbjjJ&s 8/>'MT6 P/^,| @kLN?᛾ZN?~:\#s'3%,$dhb| b 5bElUC|GpgaϬ~CV yWe]Kͱ+ڠEa.@<5I 1f !^Nڢϣ>`?:t>Pfadh73HSX|oNE_cUȢ{@ 縇@KIA}Q9#FλGF4`ALi{;eF/D2nd+eS9])JM1&ѮJڬS3ݵ;۔;1vSO31{;O$N>ݫ' "}˟~/^z'=~Qv)?O/'ONwW ~997q}rqɯcz蛉&n8Y. 2]gsW( REd.&1Jq7!gk#H7b?kqOm)^zmfy=݃-.$ WP& IzwWi뵴i 1g *i8f;ی?UeE KƄrzV5r`u<:ic?w7l&_#yNJ j]J(ƅ7U>y`y=#<+.X,YW9bj z_jKJuV.ބr,mgZ3`EwrK/OO.hsn^oWh0:zh0@mIQڨm^ʐ٣ 0zŻ)l5Xku%Ťf/aP WJɀ :;XY<+4)h/;B}%JrR6.trMu "[žYcjmN8RX\Z 6g2CdccpMi4K{vEf9tL D*7y:N-I&$ٕ$qDI uT9P[7G]աXȼfn|} Y0 u^WXa-MA0LaC:^cF6)xx-@2"6M 0W62&U.ZxeBP :?Gy3r5J L]6a=D0Q;B́1&7r~R<-GJCB Z ~:+ȤD2OECJQ&rɔ!PȘ,b Sٶ)RboU]l-e/Mmio,iAG[^H.֬"ٚj\ӈ[ t ıog|?sVӶr5NpP & dz;~JtBӰVi&{2]:X'zs#86q=N݆Cp4a? ̘^uF>9 #zd ǍҠi;:m:9)rRʳ[rhA'$$>HoFRMN._Jq!$gm@I5BoVB`8G=kiXY8kg ҈S"'v}q<2Y:x#7f99GYT$GHc5,ЧАh@>oMl;-y-9!ֶ s!i7$qٸl<{$AR3Ym_bɩ!۾5fͰZ(#eZ06d =!bPO(P`wl&'{rդf҈'Dj 6yg{l, v,&y۳30iX};Z }D#:H!˲ R䈣 4n# YoMU o0p `w{:b=S3?n)}RO* 7CMRvbKbXS p4iU}7Oxh@IMIYZ4~hn^~Z?vJpn/vh_Wg72W?o:WRӜ~DSF ARw))ue0 >ɦ= cT$k R 9=K4$M5g{HM: 1DWRS-{ʃt&ov$a9$ܾ}ho/d^HW*mssqM{E9|8H|% $=H_ ]_5msr)cFI\ t+r8nu%Ф"ƪ߄x^l@3/49s]8t]ۻKfk7SQNu=Myhwvg?|bV3%wy&P(,+D KI8C7lH[v`P{XFRXrыQ.2[o{tZ:9?0kN ⓝT- *Q'UO- M$=g 5.pNM]s7KJZڨX+H=kls.Gyԫ|EW@ڤHu~4,է&G̽"qf>9rDSUѥ`#d\G֝۹>zԔ-;GR;j!^oϩĽ}РFI4h)]6)ԾB4C3y˄F@7GϾ3\>.Vg]_j 𗣷^/j368IH̜]=0af UX=_&|$D9\H3;Ck6k^(WkTq9W1lw]N @<_BzK辽|:_L2&[?AgbcgP9_vߟƋ;yeҒɏV?''O*32*?_9q>r o$9{˔˶eO]]]]^Ny*oκBY8A2͇O,N㧘M)Z-t?/p2eCkۀv:EF+Vfj.Yv YvdDAк_:G1jU&?8gpM_/Y޹lffGw<~Dp~w B?O7Os m\Ƕ=}&]) ds&hysȢ*T)DxWx6FӝGEB"ʖy֧bJn, A&DM$Kn eW,@7xoqatdiN .*P֡hv=-Zn]-etŖOS{rh?m~H.}k%v!O 썼/``+(P2LDIwǛ[6,ϓ| ٥j,y`.d@b}5MXbLA.\m  !{A \&eLPV[@)ݑq }X1B֪bYNV&c1PB v&Pcfx}hun>6ؠSrah  am-l[Wկr^ ZX< Lc 8Yd"ܴvOLѠvC֋mJQ&˝Z4|&%YUɕbQ3[~v8L)TZi; X[Nn7BG{Hc U&⬄"tN}Aw-l5FLAAp8jjŏ=\(%rP~6gd7C^m0GPf~'.A=7{2 q6(s6hi6P;FBrGoaW$ lŽ'P˻ x=&ñYɨNE˹HL%ayʶ*&xTR! pYJ4`V2;JQgSG8VeT@!I2"Y)(1-[jUTȘs:՗mK2&k(?Jlw}Άx2ѳFv؆ǎ0)* o1 q^VCذM&t+G; ~p|'_i~ ?{WH2@?MK/>i|ݱe~  ¿R?>}Uթ$>f܂U0e>b^X!!NTG°70kR+q86H$Z~5j[5V݇+2t6ms7%V,Gm=u)%0SPpud)æPyo %Z~O˵ Ͼ󡌊&%R9}i1gښMa0Wm Y/Ы:jht\,Ǘe}F__kVyn] \-⏲^'SSw胿E:ouy4]{i3u#]Y]t1_+!t~#iNI$'%֭'…Obj7WNJ:=dĔp ѯB,5sE[Q7\kun_'(F {V?HU(Vg|}}m[zU$v]>skVjvjOO^MTdDᔂP(򔀈"2 U$AthKzꢞy_FQrlٶ|(-rN}ٍ?gԜ ÈWP;= ڹsz\V0A'"[db^Ec4ҡ y OhWr%.:mBa=3uk2S?U=a1? ٴxaNHXi|YIoKSr#k􏟚l:|: 㭟ly槻-G* ;k^?)<&@!b2R@NmZiܵu<$O+uJy=bp7V~pQ:%Z+JEGNSuU(QƉqJ(͊O' 9:·{8ADcS7A!xjm`Ągm '223{wB^r8Fb2- rἘdfeCfδ4VHDLhc}c;_CLO`rsNy.TMdqҦr(+Υ? 
/goɒ1h|:ɚIJIL[R=QKHIoovSc+$j~#_\/2rU4ˣi޴ ڼ:g4[GE 'RMȍxֹt RS$~:m*[iOڝ HDplAp7@kA\G2BvlXP֎n\.Vȿ#M,L/瘻BӢ%`$΢<@B MP%b̕CocGG))f3ɤøL+zO&hԣ%#{ S}O[ib>hJΠcs7rxQx+#%pH{`o>/oX #n*yH'8OJJٸ1|!i7!%eN6d5`=z ZKXz ;F  +Pp,ͥEe$KJit1`3y%JUp艨l۬{pUjIcs*t8p/AHv{|º@·Vʸ*Dבu1Z݁>uBDWpYs-G)2_+:d(&̵ S}-lPD*ϥ1R{"GMjVyA3zEGW:+ABu!5~F^@ܩ@y;>k@do/.UmA40,vE95Ԡq7=Akv6 T!HEn7+7#CxgOR;*8;Gw Lx7PCɧ)B ]dHE$ ϭTQڅ!\(d`iT\ܙ0Ȉ &K0azf:;WbʀSrS.0闟?&پGc׿ӿnt4]*(7];U_I]|2ͅ)I.Q ͤrLA;xz?SC2IhzQ Xa%ȶv.a7`'ZhV|dfnkiw$X̋&dKˣqZCZCW>H̄n:^LQ+{Q7YCu$^yb.hg8+EȜ^ݔ8ɇzNʇXBE|Ch|Q 3=DTS;&.X!e3K.13eQB K݉jG.(7(y mDV U+(?B*scH8jHy9Og V)  HKi$YL c{f+6#<&Dm5հnfw&ypn  :V79dʮB.yW'ں>iR0d\ _׽=u󃝙a.: 53P ى^?y7_,eyCB]1-|֘qU\MUKϲhfG/,zV+!o?VhGߣ"y4giG+eߛIMzN[g"'.q."&#U?Fy*qVb 8g022d'gH\f)EzPrH!pe +T&L%,|_WYB%il[N0B]Motf ࿊՜9+M}R<&ig# )*Ihfqu~ԆtY `V(6MoCZ =A䮪^,&%P (zQ'en 2祍oӜ#*r[i$Cr2[}o→aڢ@IRF*XQ <'}JPLD.@[5,8=]Joy[-(8 }l>L̼1E-EcsW}f$ <:&ľhd^dΥRֽJ*YWSV?VҾ4n(Yn綨e._hcg6;Y+]丑|@Ү%}i RnOϖ//+0!$T9%R2ǘ 9>('Nɗ4$[$H_7o96G 0}}2]ReFFDZ[)̾胷,ӦJu%Q@g-VJ DZSWRjNb{i "$%1$c<(Q+J F"O~µ!KzLcٿ>x عV ū٬H̚q>F>.W!@p wz!GZA|yٓՌaf^@[<3Z )Oilt* ?O 9SHSXf˹A:e+S[š$˜5 0Y!@SPgr1C '~qhSګ@[iptqU7@\!{QN MVH y0|rR"9?D nE Q dDݳ"cɹ]n)6S4^fw-,>Q\HIy+uf"q2}[k9.y( 5EG)+OWVoEQa)CA}`j]~$]` ("q冐wʅG4 X4@WoK{oJsE!ک@mq zyv6{WH^JsF'&ZrOJ' ~?>ϰ4G(CFͶz늁+[2UNn{0ͺoP#)u?l\]*>y!:B_M?e$A1ddn޿̊]S;#{_ ʽvxt-7q](Ħ T@DX3޵-?xi@W[b^_r}P+{G/^zw9J[~ĘCSĐH(|[_Jw .ؑȵqpu}3aCuSbUJJ1kxMtF#Jaٻ6n-¢zlC6qs[IbmdIHI}{8#GvT1bσCy!m)'D=^XcFz̨>"Ac8&_=6NtaC]%wWz<+ miTo4,t;nx^2(5Nd^G) ^5&\lyOG'>8F*G|7'<Fh!m- Zp/Kha~]o8f<NxoATpf%K h-WUwK6֔[w,蜭IwH#ցn8c&ī??B#ϾXCT;@-/ߓ!GG![~TCr$~Da[||[FzgI}ZFu^D D14me\ @} k8 PLã$αq9sRy;iaT7'ק ~cB8.hZRtVF&3MdznetDn#=%ɃL^͠3u$4@7+VzF1#dkbAzRw']PMݤ2h&]1Q G2F8ό22Z&S^\7 @stC5tqC^]zKAk`tT6O0 ^!aje_'L&إTIȇn y/$5 ijenn̜6r<@h:tTs?t Z~&H-mD_Wz?frVH06MѾ C畦i'[%1{߱?l=?3:YVfgg8K:I_vwSm{V/)W/ƐY;M͆iWLA ^Zzd/aW?<3Gd7 Q(ՏI[h98:A9g_ G/|8Mpukot_ysw3P({ sscD#zksr&ϹT\! SǐUSsx&r6O)$;x{`azVyL^lC}n+s;,IspVU.|-ћi֟cqz~5T贸z: ɕ.%֏K* ?Y*\j UmlnOFcsTr *_c{o4:*XegT,TPˬ('˥l&;`C\sٯ+{*=q&v yn-BI'ʫ)I܉Tg?,K7$\ dX*㝺ȃѠNz#5!@ Lps366Y7Mg<[/{>{4o{'ifgocg UW ߌFϥN8K@ ʎ< F_Ί+իP9A^EW*<LKQ ^ڍǐ.B%Dzo~EH_:xx4x٫d+L߽=-]22ɹըOSgz*YPTKUyEME(=yRBX!2H1?Sj0Sw(S2mIZH#FўxccTIBY0Bq8F{Ka#˳J?~nCpcg S9GUzM)u×UA9l|9(эUoU}-/=N.xFp3 "]s4U0p9]Je2zqUMr/K|jZ<.čV[x?7[x+V^x\E1TsV1L"|1 GQymJ/EOY%Wg&y~E@WY 9(J ǤYm.*Y.W\&&K y 4KUnJ=y8$2L*y ]xEԆH9oLoꣻ$: $kyN TϲȔ:Q0yd2B.L7-!}]Xգiù_m#d\z8|=6QDj -W1qG4wCK"2Rߨ`?t~9kc0inؗ4cmwun옓pvY[!B .ؾ2B!a2졕$*5N<72b QZ \#a"!BH!O : ɍKiާ=W:Ѽ:oKaùXӲ4BK# wFO[]T-bHTjBiFW`lD1iHW7 L#6 GI!8Ơi,4%R&9d;w9NIV _Dnvvvvzi\ 9Yr~"X:=@iyUY(y, E-JQb] .7P&f,+@)[*l> 4@#&0j-@Vsx9`^֊:׭P?%!78`N"t捰2"&XUpYbJN#3 XW!zh|](i蕒mTռo.6IbôaZ0tX_^?6wQay#RLRT8b !׏m o^?^]F T&$nfT_| vf46<^2MɯRh͒i*V@vSau||Ê1uI_@q]0/WK\ ll&UoUJY(kj*sL%ij8\y۝?.yVzeߗA2n+=q!LO{i~ѵ4/[(IvU:>z1n*+ը` h}zu%edKXHM/3(y 'l}k#/u6s;UrSTU]hTۀo׌9_<8~H@ǭ`RrFgQV+qUUxϭ$cL(?*z0pþ#۹ӆO 4ޫv;Q87*IjLh@I{%x5 9 6奣$J `2Q2Zu+?hH (^iॱċ5v bm(Z!&Ѳ "f040 vȯbֱ/ַ&=U^[TfNˠh@3MoB`Pɬ2I=Zຌ'4L>AtA>(ݦz!"dƄi`ƙ(ᑂ& s}lozHm5 m.?e%=NJgz(b¿%Q[|߿K*o#B< IJɶrJałlLA0)hb6&6HL7nck[8O8`j$XB/>1TwW ؇`1 וr"Q$?#Je(p8b$Ғ)~ùً\R) גm+Y'(Rw%e)+kHO ] @rXp,ea9q]o] SL0"3, ƀC9gss6)56JKNzl03+q|eR{FIN3e@t9ţZyȔ+YLlvD $i% g6R󺴄})Aq}V*zOlSe%) ](FNihWK$%^-rԢqRBC)(‡ĔD-n-S_hxH8-rv 3xGȄ@`g((&,DMA H\ڎjjfn321I6=Erc$[8|ɈT9\%0N^F(b B䣦O<3/+B=xπ_28!gD2dEԬ3gzp8ֺ;ZrCVxu&_.?)En,UDA+f(HACdPΡ&##`]PZ[ Z8T,*Y hEP8p#mLUH 1JBḍwȰlZ!GvVG;JQ;Q8.Ո(D~FV. 
ѢX`*)@e8^T{ˁ2+{ȥ6mxAaQK Cc]'"yKw㜢qr 7vQjE,-onۨ=_\^Ym0q( B[$ k 6EH,d\~-fFvFuy \[-- a j)膀i2n#)ȚGÝd nr9&2d`)c9TvA p5y/<ouN(&x)^NLN.DS'@8.єyļDS *1ĬER&A#`P-JK˽螱L 噞=RM [g:Ez(<#|Tg(6qB1FsAp͝9f)X%Jҫ(w;`c@G &!M Igz+!d)TS=Y*F,~h6CHMz]~R3yJo ɇ$S`t(4Q1ȧc}#!5oFj:Kң9ÍvQEm`$I o-r.J1iƌ> \R8o~$J~6~_Z]ҹ:OjH3묛b|=[EQ3 E1^nL'aC,a˥&ࣗ9Ŝ0è 3ET;ȉk>vfV[GmI$7qY,;Ь69/-q.ŕR<3Xѣ$6Pó7b|$ "Jڳ svQYbdNeb9G9]m 9$Z0l+s.j>;Zdk@˔%u1c lݻV362Z0h5;h0fIY/SN?ޖ["!z F]gR25pD7'ځH]ՓX!2[Յɇ~ss'5}곴oNg&#]7'[pkBu5AGȿ}$ϓa[Q 'J =.YǔJ7<3#ƛ,^ߍE&fL[vh;/,+,ݱbqSy8(ì-ȚcG *r:on}bu, bɡ';#HVnN4ܪZAPG֛7:шR,Ѿ9A+w!nզJƺm\Q[8/іVČc[,7{RzEڂ4cUv-k:&Xh-z]ePɾk:ǘ g9ܱŽ`0{vdo.fS@>Qr;TQҍ2۠LhV09L..^-hg3nKl Mc5kn(cxA1C##Z+6wXϨl5s  }kDR [kr[*sC4Ͽt!Tod"펍݂“1ʰEF`(C@XR"$)jLdW2KIn'L =0%6S4}~/w4<2֏?__Ui#WNR&9~eH? 9+o2lvt gUd6&%G<1i CcIQs& +Y!?q:憔1_P 8pAU (Gܑ!T=&3y;?|c1K&CY1ۈyot*S;ۮ5osw'u|}R_sGhDb~Jd%Ә1't@{6ͣdU*D'e[%?9^$:C,Q;UNAg n0Dr0p)V"z # g%@< g}P**s!Dhj*!UZ }sb[cANׇ/o]|xNQmߋ+ WB2\/䇧 :QC9)P~u7r0 NA7#6BYyZS %cc8JacZ}⨐L|$_*@@~wIH.nϧ6|ULoX8eAsx [vx.f]g:ZVQܹpr:mAxT&9OXZz ,ۑFC 祧3K2vq:X*s\HZGdrFzI_J rј2ބl5e{9r G܂J~to9Z [{A =e\݊4楇P5d܄ДL cRZˤ< ZG. S2Q %?Dw^z:Az PR]X6GK5뀖LD~6m>+3ɍ+GCߟն݃xwћ+8>@EV%_V|7JooRҏ=2>|3?1ߞL:^gi3Kw;4cz!"BR 7"-q6s IN+27] E5Ӕ?h(ꯃu`mbmmb*ȝYOQ 0^}\]S9n`txl5RT$oq0n]:oA2 &SPLoXǻٛ :/ұݧ{jP9zpwىzo^c޿M69M9zO9 5W}stpr%^9vRqg|NZŦ;/$%dRp1v^I {ƤoWlzA{l"F1O٣x0UpS/!wsNՠ^]ݦ_3=WXznvZ  FEo__b3qx.ewo|2!V!FEI*t&kL$ p"d#`@٠MqvJ CLL%/}P)ƨx,l9tDRVE[Hd6ٻ8$W|`1Ey,Xϋ "O#JIZ0}#dꬪl,UE~1|)U)$"T !N)cLc2Ҥ#͆AJV ͚ңR ՂZP&TR£㞧KS$T(yBiDR#Դ=1Ə7}L75dR.o Tij=m!#+ GPvb-׽PFTA)`bSrēxooS?V)@v&q ]fUJ"E[%m9 P'*FuuTkLkri'樧Ѣ " '-PCL\\hhZtOc!P-8l ȞoK5QDWmIHlL}zև%Pt=4ӫk՗*rVaj` W gVL[ӗէxK‘ֿGU735԰[wYj^>:Jғ뢛W]ix1mY`LI%Eq$7[B~ ti#at 't׽*d"& ELA eۣiHD[‡?F҅rt?2koMpݾ>}v]\_/r(Obţmz_|IˋkB`CQ{^P i>MB!Mb/˛7ۂ'99ž@Ϸ!f/v\Dk&#:y&lNzplR v6ސ9û5wGޝhi}҂ f_X6:Z,ǻ7O} 0g_N8CcH M78kk6|cK!-%_];kpĚǾ+"έKd105Xx#@H`!Fa;ӖkG%G,HO:v02<Cɣ?i&u+rqNӶcۀ-aѷçGZPFa$C^g|5&2r0׃t'UC׹k =,$ZsicE Aob҈G2ekOdhsi A;KFY~g^@X% OPڏ,jR^L.X)bAiyg\ 3:3.LSs Kb%- R* A);ˇ"n%G nibA.41 "yP) @t7̑PHTIZS;B L(.rePRR[&kpSyܐ \W@3=Lis}BnJ,Ze{8\AHp\RIn*~J:Gl9aVV t*\) l|N"9;ؘԁ\.E{w9<ai `Q6x*/$.W#o7Kprm;2?eu/{(m?UIzl\QH=;$2. 
p7Z2ODDLAfFe'EG^*4^ߧzBNM!q 2kDŽ.k1ײ%-t.ji۬ʉ/G(yN\3=Ųh b0 !f LEW@GjZ8X8_|(|xÛPN,LN,, @YX!igaF>ztf'%>/G7V@1GgOaד1>G %`&Iv?魏Hj=^c U3qm̯?p%^o4 |&= KTp0KF; dl[_Cs,C_=݄JM|?{gupU ^>wpF {L;RYV^v5ڋ.gdjN.VF>~t=~xb!F<@fodף3k$:"Uo  ƣX{UB[ZǔIKB @k .Cs؞><=ҟݖ3ߪW QHgL7"Ri,HeUwK~T'n_6)qXrbx Ϸ JA<u6$$H˄A eYyw3ѕ]r|g5'ݨI)-cqxx(aZM RJX ZD$8ż!5T MfjFoZVG~Re NCis `Y%cߠ'euAkezX]Rd%sF$*;}bn$VwSehۼ;^zoӮ< YڦOדy4L?]n$^-jϛٮbo<_嗇x?Qp\W 4n1;#caZo~Y|>*34ZϬ5IBr)Jx:pȪv36{nN7h=$)?Ckꅖj>$+eJ-n .h@Ji>HYEr.duHIǾ-%;wRkn sqC1tDg8o(hC 6Ҡ)3lR"/Z[Aov~PSě 8R0JBA6F,M5B;OmCU@ hߦI[Vɗyؑqmc1JE# :_׿P~<i rUʌ4OdlCzlo~EOyB o*}^t+s2kAJy|PkL=Id|^2a9۸8cpfLF7LesaP ̵r!-\>7 =К08eM1ܳ&&RhmbT'2j+(3F~x_G"IE7O1E.q*p',q' o.Vea^>%9# 3#vA)X7{ztf'u8э^< 10Ӧ:sFi<%JayzXO?y|'Wd:ğeHN[qlNNIJ)a=>Mo5K*Oѕ89ki Y ܙ9ΧwMH"XS$x%*dp8d)2=}o0͘_@ADM#\Gs|;}?pչ}x⳽rO_dŸf5vsI"0Z Jqק%x^w}=x"1?9~~}bB}~/sBeNHl'߀MC67qIG Rn(BY4h$cEH!qFL|7ϫEe#egRWy\s؍&?<˽TBwa%PGL\lJ Y %b#'ha(QYx͸NgÒ5\+|${ N}q=e=ѡ6:5V~|P{L"w+q\xFA6?c3Jj֌n䌎ȉ.m4x$q T!i)qc ,<)M` O>F-Uŭ>&@('#r0Tb,)>-sF=ق %=_<4vص}Ǟ%5WV F]4N˂MkB0Ch&}$sM{h%hH{/S/s eNa)%O8Ew]hH;")n>&-&y9ekfKU]"~|gp"GhД!PNfBwt1VXDUm 64hL`@z<b(AKxo&Hj.섥 ’Qb#N& C|\rnM?{WƑ ?6mDxzW`ȐAl= pHَ3Crx M{fU]]UUM$9(R`U!;:j.ƅ j=CF$ #W!{ǩ &LyWuY5)adK~ s,"QHa Ƌ$<ш0|&˄OPoJxX_I*1 (dPh0};A @HJ@Ib+L\XR!rerҏ߇)viMn}EF0"~Hh{`I|3`AS>(~m]@DzW2azcFZqs]uteM21eq'wvt78lgϭ˶Lf[< -hpgͫ.HB"ǔWvA<>~zfh+?{ZlƳ+/xQmw-z=^dzq|4-2!xuwkx= CݕYv޸?t㮇 j\ob!\JUvWx'-ȇ Hj%}),W0\hD`M}Ihк.q뭌^K AyauO8Z q+O)[BZ-e{WsT %Rh{Ew2Dl6^S|.06'ùVPR+z (\hds}B' '(%=A%B0ͅd C2nkԸL1j ,t[XY^5o.t=BJ[7dTB*Q&ígװh>ج. , ǕI5kG#j6 (pGt qUQz{Ws '4,I=Ty}B@R|b!!\YSS]1#ؾ]vyI'!-d~ư6W33`ڝ+kV<~i^R3q `W»a8Lsmvs /yXwerQ XT0`_;(%umRT% u]TdQ;]RX^Q} kƨ +N\ҊIIOv#U$OZ(W$Byx F]$mIRK%gd v|eŽit>LsIJM~:,hNjPNU튭?26E7fl2];?W7Nk=9 ,!Տ[e0~Fu~P?ZHk8t|wȽrg5zdDK"(5'BJVJ;P~-;$p'3J~#;8^)%_+<>NP\voL;;{dzK=GL+XRu֞'B3(\޲ӠQ=loM3[8ͱsĖ-`@R2[>[b eKNz$nPљ ۇA 4~LI躎2BY#/g'0^ege ICOfT۞ g앮'X.U?vSb~r1pw4|Zn`ﶶHϟOv @#va>gՉރ=`yeҌXωQl@%*x?ν3gϦu>烱oɯgs x$^jtW_s_ҥ\_?Wovbzd ]p6w*SEVꝇ3]&ξtGK*qO! xs;п94y3q&|6WO•eΚ2;OSuG1*5\@b+$Wu? O :M?A3P)Ax>[f<R8TgV>_ohߞ=^?< y9߃ hfbG_?a~nY C K/ga ˷\ŠW'a:O;GC'z B>vv%Z6N"6)et8_}5^"3 ׿g`bHT|2O] jxSFZ\2qOtt}9LoI^ߋ'dH&Ƭ]ґ̣OTk}6w˹||?}Q*2ЄDp!rXJiU)f:I&brj0iz8` %hY ,qٟL[feƭ75KߴM{Q?{Л 7-c3W /Y%$nȂl?vQԚuMU6}Ա:#$U(@RRbq ry? b)[;Yt/P+j_;h0Q=El ?PB_{!Q>>HL3{3%3pL}~IgKE8 Aʾzb-0S{Z,n5{^~PeY1 'w{nNɟn@3zlȏ@^H=-T,0fx4*4MH["pE) "~=k5&^KELpAy@OP`&jDH`0DQ}i*Y CXV T7#JOahȁb"Ry8F [jR<` 2!ƤF00QawDKh*\{'T)%R,Kna~DT2@A5}0mx8\4qgӻG;sTfn)YV0O7"a٫mRt6IزzhCz+mv,¯CN.A8o}OחZ,uQԥT}LK[[8[9tUE{U.}8;Yٻ7:ˉڬԶqHU7J0$EQJ^Q@Ic@P*B%$S]j\R{,C |w6OF/A\AExĜټ|nseQBJWiSi X`e[W =al"LOFU$<; ʛCT: ŔJ֛JjJq5]źQZ';WiE d\_ q.ǎA|Ư?floAi#%$[?y֯㗹0fWדK4ղ-$8J (qOKj'O]Gf}Oly{'Ʉ*%:eW]ѭnǛKoI(vHe^pxbbGj0LdϷԨqG1++|䮑 yF0rDGig \{Jgq=zfH} ߖ;@/?-//)b?{"oGl1 \( TLADh 5shE ĶޛlJTQY*8]W#԰h7})lr@]O~rfBȈ*\ќE7A, Ak|puavbT^E׃ p ڿ̆(9@&~/Y^hNн?\ ϯdJJd [3{)|yi2SX/\f^heyVh JxI_X/~AK)|Y~S`\J`ܩᔝYU;^ҔRmx5K79gaJAlKS= wV/R]=gS%ۑRăeq  >1EXf ̙X]ʸ2%ز?#LJΩ0D" Ip}t$YwN"$قK.K #GMءj$ǃǴcnS.hN gEh^ٔ6LWMgтv"G=pP݂h-xx(̆mIxCi62ӹҊF+-h뮂g'%/t9n~G^)vtys}Cfů߸# X2sE!sǮTzJlIyt%[:$ɅI1zY42O'._tӓęw$sG:'(IuW%rS+8k+Q)R:CO5!͌'vؒJ[M=hud+Ч&bJ;Ⱦ*@ ?{) d;Kwρg8 r wSKIyK%w:=jJxr}rO@0*#Fh9Ă ({:V֕%-_ۥީjV#ɥHѿm*}<.GTqA{vцF˾؅Rу3JeD1zx&Ts_V*.k4 74]=HMv :9:f:޸zg}z s?]'wr0Wpv !0qiA:K`" "Л J6P.#LԀ@1x:tcO[-$ol%Vqezr{9jT uAPw+,Ab@ ʉqZ*# #]nI4u7K\AB!rNLQ+&U@o, FVȁڡ Vld~\mdAriȉ9uyS_5Bp刡 /+&(' z!]@G1_nj?͞ROֳk39e'%0"llImSm[Eeڃ.ݹѼezĦ@)v Hq.!w9^pL2>pˋXDH Jcou0HNra^y9/B@P()$AGY4#\mnr07֛Bs' ss 0Ln7{JirN2MkI:WU,tNrG~ 2(NUFz*ޢWĝJXpcNC'"ĭ)7h(P`! 
8@ߑFہXP6x.}܇ Uamp3j -U<2_ؐyIt=r$M28lħfٷlAC?( -.>x}[C͞F!]n6_xf#B,?sOdz/韹Yt#UEhJd_( H6LȋF7]XP $ڇMA4t=g7wQv泫ۻ[E#=ίG!_x6_>x85 7?vc7:|U _brU_{m}.ތE57b7IOG{S1ׁ[$MD@IE7M gg Lomgw^6?eU<-e`٨XP"*LJ?~0?^lFYm עt"0.U\*X9>f ̠|R<*xĒŜe2 2'[gjZ[drJh-]6-Oxeu=%0N\g9Z(F.vo?|]vbwTG5L]o'q(MnY?Y=؎ygؖYev(v΋eAAg/w$W٥M :>K /".d] JC Ӥ)sp?]"Dݥ$rC74z*Hn R^B(뤤AkAq$>KiTӂ܋bsul([0\s<;f!DaL$F1Zp(ִNJ; 6.?xzXU,IuBíKFݍoI83QTY\u%Z(Q ^͙h09CB#JU,RB0cDh +n b+$*zgB !p"-wQe[mi<7A9ք_,igLRJc h&@AӖYICB\!#/rmmgydפb/H$^]1Y:*&#W#.n\=J3ݨbr%U eߥUFW}J3V]j)"DVWj8M*cڅƂ*D'U{Ղ)dFiV^8ib PU@s=FꙌ(4q#B5ƒu2QUE˅ꂰ①y MzI1 6t"j':q"AU'Fbw"mzbd%5Ѕprt:'W1[{W1[{U֖Ox0fV` Ct,P3x- ITHQhd1^<慖ɝ۾UĶ/[0hTSNBD$+g N)gAN%-<'6tT'Xٔ9n$briMA<'EKR֦ F txpS:ltv6E^'Hv"hXg:(ejSpZIu#8 ~۔7d:-IBG$\UeVNr\,8S 1 pA$u*gSz %=Al%=SKz,=vI%bVq5{l;E)5,Y;E)( v[K3u<Sd7E!К.o%~} \5N+[5N\ /u~awS&,T3 "+Dgf5 Ҁ\j.y jXu?oN) QRoS>IϔJBι@B{R2pk$ {cs”qSH3E(T'm-N+fj{{BcrNқV*v=B[g6^ޓӄ\B)Bb \h= z"LJ]1DITR ] ^ ДWA>û ^n^\+TEmSI J1^\$OhP"K JT kyf KSm\p1\0kn cxShEwP.6."5TE ]IП7o45B8eJ ;(|Ysv/VB0Kmnϖ};FZ /HޚP_iCUz0zyA/16NQ./idž=|LO/Mc,9Zv;gDሕtfRՙ&ߚ㳞$/r)( GL)FF]U#]y4U!^~Q&c/+. r=dpCfOBf-}Cfb èbXSI9[(>Iq1^*/kA1ȣ2 Z=sGxjU n>6Z>DŽ3`aݱee\?o}lQphU)$>]$'*ϔ2[lxB*"S(T vRXkNg .{:Ԥ\ 2RҨkn.M~5C I-qrg<>8gũ`HHͨr^br +}&ngT;XtpCw),&Inߦ4svo|܈ɶ?+}^P2W=i$ZG7I.^xWet$laGuVҁ3_&`:nx=]&`!F({ F"? hDnaK$R'u§l eH{nm@$|zk;MK@-p4Gݱ(hyjաjB"ox:Ydi` H̓oVM)Ι^67|qէәtTQGH d6 C]6:S'X)Jaцih:C{פsZ3> ul;3#ٗ /etm_ݙ^fysOVss[kˋď祦f82oc&K*"]ݹeYa(Xçl6zp>l!9 q2M#>JWSGoRpP*^˝ B#K̭r8°jHˑR6{*Ukkm3J5#B :1J #H]B:Ah:ԕ-RRC7)KWts{B L* ԪlzaMlbJ$d|kon:̿N|@R"wDh-hX`e(!2S+k@®w"$=jj'wP}T17$1v9hLE'O8(r63ddimҪX|:gshvѵlVJqSŠŰ r>`x,ٛgnMdŔlE-y6>¹6N&)6Z`Uri9 1`,\S"TREIY(7@ԸD[K JRl ):Zઍݭ ֝-+ ۛVS CFy/6UϰpUkOv-ℷDŽ9uX(݋s 0޶N [s:O &S|hnL>YF-hJ7ĝH(}`n]]bv *ނ bQ4;?qULx/K;eo[vO00zСlȢcSAx iлv<ؠػ`2z܊D5魰Kc0+U:l/" ympsÛze |t%_6af6y5ܚHjM d"GPUގ`DL ʶ(NPd8_zX0%\$_o5,S8sXP#k%p6f;5t={e>YP.{|Z!BK]N8$})?EO>纁6BY@OᒆBshB|:~.yN yy>mpƅzZ)<)G Ӝ&:ySR@ uZ.De{jaQb@.ْ6$[^}yd3%-`1f]uIPzo',-оa7 .g y\ O&z@dM8 h m\h8`<ߤlm27YHQdJ6X S"ɈR6vpI%f\\ND~KV-$;DDo(.qgڿ5{+MuyJ&siʋV$͔KRk-4_*x$b~[bħvpԐXe&^ݨ@ΥI#QAבxݽ`]0zF - P"N uu} :,.}Щa򢴤K'N@e zM񳥿5BJٕoT+b9>jQ"zzJKx=&=V$vJyU\)Fˣ],THPYF%KS_M9AX!{y8Pֈ)O04H2&Xe y4"hHb4G-LA$h0 UxНja<%UPWQc Sbatxn_yߌhhPS7a=>T|M)@-݌gdb7~tĨ RZG2h½u帊j喞WzrFQg`%"鯷+T/4e{rksyI:ƚ޵H6jW@jXgQ$jT+h?&R*8UzZJid'wk Oygxe*{vi"!{ku <}<;Tv&~u8 йh{r!Q_;p-G[oڮ }ػW~7(zTk|Wѽ=a%{M 5LZ.e_Y@PX7G?қ!Wu{K$%Ɠ ʉY!!:TO肠1V!?Ĕߙvi;.a)T̅37?~:qo`1Ÿn=n6w905$mЎ_W"uel! v:'dFr |Y Lf,wPƠxr|TrcE>.|;Na80/stдs3 b؎aun /o\/`5na,%*ET0tL89H|6; DX_bL.BWEPUQ\z ư`dsS %70g jRXX ޅDtϑzаfAhJVW^( {=k:l5ZۇK|1!9KE)e)x!I-JrS \˩)&1F&`/q7J"d9umFGc~Jmn OhMD xfiH,$?RAn1R!w_B@,,>5Mlc!:Ml癷XJ}*뭁RFw(|VJwϗy=Ȋ(&ȣP\; ݍNJE4X׽Mv9qKmtr!i^uT ;{h/0<ՙhO6T~}}ljKϕ'%~w>q4ZJ-m|7s{O6`RElF[+9|uS_c<+Z{u?yI:LȏёqP E3ah3?L$NYcp"1_ti[6CECˠp`0 C#xɺxsL'X攘S$<0kh(NNiP;o  m8^z xk M|qpX\%s,ue @5i1JLdNSuM#>pL,A)#,I-L3kB#Ng֚KLsd˛ /t}cgPXg_=I:Nuw8/U6O GL{^٢3ZcÃ1ͅ$O_ˋJx :sAD$'+1_ߟ0WƇ|wnlf _G#xpl0@DJ!={W1;GBYxKqRmcic{-u'-&xY&<%,II^˧D3;3\ Vc`Tnkvƪs)2AxBֈ ^D/G _x@< av2lD 4?Ypa͓> @! &`) "BN#="B$Fߵ`v% 8N9paD#6. xd"? <88%+_8eIJ[P;FqSv WSM s RY/pP<)n'H]ZM~ȴ6A/ȔHKdMɝ:bd"7~C}WAP=34G#y@oRУi 4 ʕnsʹ ly!Xhʷ<#3eqژ$l2b*<㶇bt+G #Ƕ:a_pjvhGط2 =ja VRQ,nD8B閌-"Bݗ_zmgE2elv<E1\Pr`c_?ƙ2[>7=xMA12P>Prg{`` =7$v\dM$OyR0R>_ RኈZH"-CzF 0 % !ʺvwDY0jfYE[jðSVqjs`A'\ ׽㌕t\aiYe~EYBڮ^jGʌ&Ύ,sv6 az``ylPL0N3 `v[/1m }\R8)NшNB;be[skUtbBN0v4 c[5Q̦0Rf-(؃Cj`[uStgR.TN xB.=w\=gyMr*RΝJ9/%Wĭ5Ccvsm9Gpyc޺sFn'}9$=(Y,& 9/Zl>(mtI^K^5?XZUô f!WD\}lk{Z;ԆNV! 
q'[ńAmU48ihdVQH7(HG \24$hZ6JǦg-冡pnDٔx2Wzm\[_Qݻ:clš;Qo0N#`-v]ϼg 9WVjxvTd]*jӦ, ^XA.^6 GiҽRŻewbsXc5YqēeހLo͒5YC;(B.dzu;g԰5=h^onPS dzksw:r{Q2M|+.C8yA?'ќKm ug$r>*7D6D C(0IZDoP0DڊwvXpy#WA3z,lj'['D]rbϸ#Cv'>9PFK8C==NHܡw]ɼO#Vk#Ֆ8ByPP'H#Z\j+~]7*qK7nS%w*1ARnFQ5yk*ojU_6w6#Pd%xOt{6(w]"%YG-FX?J~87` 5s|*Ai@).q_ID'l-u8EZvTh- R70JQC6m/[Lɗ-qKlM%~$z81B=L?_07_3+LYgGz34 (Ń#`S?}1z; mO/- rR,dwAsQ8CxX/w{ɇt{?-n; }P /4$^$ __*|)˺˗_]~qiw0xt=c7_\mx߮7$(KHO͟qUNoqjɷMLyb7م/=MA7%*`ܚ㬱Yp:gÉ$4wo}P̹Zd?D?OTKU<[o.|Y BMۗ?ՋЛWI|J s ._ۿ4<} HC4d?3Aǥ`\Gɧ[.-!}2wSZ&z | |:=On&I0ey䟯`*y[ Ur2g?]g,mGg3_˳Ҽ1?s3٘E#ՙŦ 4Kx8\GIMQ 3E_CCg751l9gSf^G Lb&ɭ_V= %?u2,8c ?d8W1z׀!nvlv~VumgK T!uf(a*NԣJB "+ecEi^pM޽{ל`֑,C !)yZq6V%kU[=MiG.劣ަhPf69fG+\Ϯ\}W 𑃏X?wJۊnVb<σ-9.<>Xj|zcp(aOq %(k7PoN)7SZ|9O:/[ms➸>=-[WWK՘ļo5#9pĶ3)K(1QUL.\UD; El?`t8-kb3/FTvVuY9r_wwcR[ ג. C\n?\* ;A*y'#7ms䦇>ru{#N]0.#\;o]u[(J:5vP ň3h-5k:c'1S̼Ə̼0cT*s˛?PY{f} &IDف1vri{l$WӮV "8BcU8nLvkkפЁ& p@ge#5|),nZ]lj7 >%Ϳ\l2d ot .}Lfz8 4{=N$5f^jʉ,<80$"SFħ:CzBushлz 1mb7e 6,{goѬ^_S>G|9th4 %3J"H/4c+$j}5OxjҸIxa䢂Ų2uנZ!FGe>ZOh>G}9E}3G lys9HycaL &RDb_2m8AQQ"A*L(x!(j0|ZEe' ra!/ ccd[lтj&#5"WwF T#/kG\N."}g Bx8 ő>i*cƧ 9L s4B( fY:aJ'TRȊ@i$h<EBGLh9'ДHbZ"ڧ%+( ŏb6=;rO0|(=i4ſMcU_bh oP] 3]}QFh .yr|OM7ͥDD~}sC#NY!ůrn!4yxH%p m D`VZݻ9ƔP*@-rCcR9ZcH%k{ $)y jLM-)qaF倷!aɈb\P8}P 7'X)6p61ͪwi)!an0cE7$JXؘW_m4MLLT9[.eO)X9 Kd\2g\SD B#*R &P0Zh̚0)/@DRED,R4[w໿pN[PnA 57' ˶9Z]8v0F٣|XYWj#mYO,T1&1;5/[Ɏ38%脁*d# E _ݕjhʚZTJI$BRpC\sؤ.|/y4D@Z c,51je g!7/lt!j,[MbT.q?vP>/W) /Ibf毻8̗8Pg{qfyō' 0&"az[:v\&V&V&V&eȟh{mlBM*ߎ=d,YUdG؍$70ǽ<Qi&2xQeQUqa72piKޖ1)239=WV}+jk洴`3~.ˈL)xڀialօ-,)Kj8kC򷰢9},V9VZl ~Sζ࿄?)&#4GDw۟t8m-jjR^#SJDju. C1S*2}CXCwh;ȇ/=N'=YVhcz\n48OQZtǨ[_XQ 2CaHs˸͹egm &LĕgUu?򬵣R>E;ó)Ѹw5jDgͰng=SmU liރƪ\J|@FR<1GQ& r}'M J_4HrPfwmqW|q UUf'-^6VkCQW+0 )C`Иǘ ]B:ʬÜ K}z}Gjwk 'i|}ө)%)g %| CżJzKL.@?TwZ=DL.@7-6i>*iAFʱ(LM)C^!]% ZU[C?6ؚ ^ӸA\ ]Pw.AZ-helnOr:~,<tP&yp^%/~I,rFYMD7V6t?:Zס¶Ɔ/$9uǢzEtqdh-? E$7 sCc:_fC-$I>x=~L ;Puv &&?VShA0m-<:xBf,2+YȬBfݐ0^E\.rΘ&%_̇~J@nyǬ!G6r-ɪ.0h ;klA>}Wwe1]Z -XcɺJ-ߺHש@2aANUjÒJ'gCT}8xG/WWOONn1څQ`0='#r](D+̋µ{u SekkX][3xIwbc#9{QQQqw3BǟDS1o=j)D3|Rf3O(#ZFYw.ȜiD@zIYUjiҵnD !D pO`kUԱ,~ ' K1JݐpB*mz̥n\s>Tm/ߣl v8ޞ rdq[sO>B>0ͽ^A3ף{67EO\&}z +. ~}QJnIn)fV]u.@e5gF*ccN/;^X#T2t1l2cӤ@Mx[9][y”@77w4IHZ>]݋5is?of>m^$mB=QB. 1I`^<@ڡyo=E'~H1b,{5t#B$Hw Kmtfٻl|.7_ 4CB8;F3Ӆ:]v;g ]N gu5U7u?:E>'%wzZ?]D K 0d  ڙU.)T^^!_璧$j&H[W3#f.\2Y…bͽ* #` - ~' 얐Yw7Kiy$9@V5!N*b}G|Z,h-?m״Q--5 &J8lX[+[]Y$>up9txP`0. Cg0R 5m*#d?`ѵa]~˷7y=KUm)U,H<O-Ka| x?/Izq#6[/jW܅ABB$DQCIiog5Vφ}Tl(~[U;b{\wdsns`IpUvtsc-5#n7޺0KL_kBK~^2?O%xP%cC=\%c%@]21*߫B{Sڙf:ʆLӑ!P9oeL5_l53`koo[m 01sYxllfBn6Ӆ-Tz^VqNSr7Jw.jD.z[n((>@gIϊFECht̩Ψ(Ԥ/3u'%~=kLc4LӲ]u\ed ?jA,0uS1LgɩJBp.V6G(\ 7ICq9Rh^Y)3o.үW2X/PFG{{I=vT١nGAtQ quJ@)HSN%i"0iS\`"&gY2!r ! 
=.@ mRrdZ 6W#9Q"ErPtZ ިM/ 9穕g|{v5l!{*us^g-wӿN2"Sv|[KTuUypEKXQ2mȒ@$"W.ʔsr!17f6S' ^3IvLN0vH;E.zst>%O_/ߴp^O;N"z:3]jJ [ȁ71Ž r}"Rگ+u z$ '2J/6A|qsKAthK)cʃ5Ms_1D$l`21Ib9;:P5VBzWwo}bfyJeM7,0fpMLwMbgtDeQRynb:z,zR00%5(RKМugPCR\X"X*&dSG0=&[Ӽhɟ0L07w>aw*\+@NT:7.Èy7jy׻pcU(f;qjx.`ry[ ?ɎK2Yʫ4J8yɬ2 f%cRMC2F1[!SPB13+vx,Xƙq5;%wYA90 qI̢śri/GƿoȈ55gq\]]5^VUՑbiſ_ZK~fZd Ȕݷ??ZjKSm-OMK0BB)[[J6*3AX3$ (ep Ls)ekR({=?.~{vY8 >xEo/~\\ ZؾJ/mr|x鷫Wˏak{&{2'gNֹ՜Uyq9!tBI )Fɼ*9{1ZcLDb夵 Uڤ!Hlnetδ%vf"̘z' ="Hɀ.b2TO)c'$@,iɋy2zwF R8RI"oBL"=C9V^${I$֬wwSN 4C,Xdt̺tT Dj{'!&ł g9~Z^UYDaɞb @X|\IYm+0DL1Gs,hiqeYu$cJM۠&rO MO$Fu*Y{"kt2}:GKV[AKJ1Q1AfM҂`\yZsd4 9ReKt3mLo?g:%rwt94(Sg&z,V%Lo^s`˂ ZksL>2g&5ym <RilP'\9O"aA2̎UViY$M~p9'rO9Uo"ȥukDHM$ sU]#m$S,Q?E+=|^[9ս K{ЃjU]L}Qum}xí:kP#_U.϶.¸s\ -kXףV'1mb6X[AoA~hXi=c,zJAw`2n،IZxlr7޿5Cg,ΌǟON㋧I3V!dIxIxIV'93V==Y}}d'iwkpmOb4]ZfxeXT~_nFv ikSǩBrl/Շ^_N7ܧ6MmxٞY^S9ȝ樜!U4B^-5h _W mYݎb.qc[鎆nU>DWƔSR0r˜WZK]m?(ħP-lخT L^"klrpzaؗb%oٹcD1"᭾G%1#EIƛ<| ݜ=8ŨwmG'r OJ(PJBAτBsQrVTy6LD.ogd XOzÇ~͙?3qѰ'g=yW-jI8M5E#8`xBY1" )h,eb}VG7QGS^dqB"\w+SSf_;e4Î#J("g?k*!3o~&\ɻM݊:Y_6fAޫ">Ow-InHT))Ei)"dfaD4!::8T$>|!|4g2Ƃ-Z(v@>eK=kmz% tǙڛ'ãϴ438_,(Fgj1VcejsmUⲥХ]ő{wzÅӆRld<\p&}ZNd".L Vn4CV<`@kq{ (7cwF.wjޓ&=Q KUh=Kʨu_͊$a\ *5;HW;_1x4p_tvZL'xE3 (NY䓻F QN&AK=7wZ)"uQxQ:IJ\_jL>L4LoAa4F' ՗dLHbD׆u 083-lA _r 4*lٱG&M1ɨp&-m;4;l)IW[NIΌ$omr#I.juW)EnjrݛH C6JK(! P݈&Sy*Zu^JW)]^D^b7 g)*sH_#;w# ])u5zyu<:m!+ ~ yRP|=}8WO W_!xAlg73!KŨ*gHFi-X*)¢BUDvj; B #(/"JحїeK9\aQeW ^k@ \JMP!&ue)u(+@!Qt}88JX&Y# Qlˣ87B8X(޶΁h3Udh'ϞE5Bkn:Urjș󪫌3ujslǵl-m͡pY|Ӄrz<>o@Aliymj??u1o3Q=/KđՊ8fZ2䵢"yJq ݶWG) 8fga͘;,b,E\BV\j:}@ˣJuKH$!hž; 0QOIw QɘbiO81H%Xw @b*-ƩN8q_MSLΝ`fmZMs4{=[1$ik ZgF` $; *1D7_LD? qTDxke:v,V_·r3֤οkĵ\n)l76I4N')?`j9xc aZ1> q&o@3P6(O )'Q܊XtQm/сm[􋽆tHMwwP,z=Bb70Lg;l  Gw|yUxۄCùîJ'\re4?W=OY3?G}A.f߹⇛WauOxvMߧX.t RJd°ZDƜ+nY*\'1c)F()ӱV vR}pJ\[{ ,p=3{7&ps]w%:G_?9c׊FơrJ*ۍʥ6ʉЪo:0]j+AƔZII=d1y%Sާ:J$(%Y+%$따sNtcT5ld,!)ho>$pU'%* EβU)D)Vak6ƓtN i`~Y"H 'D.4h2Pq#( '>C hZ@4L:Ao~4p$N -p~~."qI۞57?ٯ8\#\,tug@(m d a4 pLvKh9jq=PwĵNK܆mk/A%J=BAkA$,F41"%s-V`ޤ쇋.D)>鶫C촺BDB[^žneΡU\e-,UffBazJ~&RZkD D i&©TwK¹) qq;I_+S"5ϋI_p$-m6Q5JE2zLrO*n&k QbFHe9^^yPe2_xn,JN%wڒ%LP9^Tr[[6-nI0 %Y!_ϙ|7il$(") !4`I9D`*N9vm2rȇl\ɳ폥 |YW@L--_c YlsjQOqZ,l}Y#^ɇAoiDH ', NI)mygaJ Xkmwt[lBj q-PtV`[>z#g[WLOA$\Ж`vJE /U`(zE5ajqwa;V\kB@V,U9t}yb+ʘaDqW Is"_N>֭co}͇Q6Ta 1"NR.݆)e$6I |eB{{3A:ûE ^ن^ܪ:}A^na1=7Ӭ5?ҍd#c`ǙÝ|g=yWAjPbg>FmJ˯,@&\M>>C;|*@jЙ<<{}7[7KEy_/ rz{{ogk~xݠz;Vɤ7\'h5?eYOI9f?\vZ*4z?tfP2$_6iA?))dD~Tq(FRkWZSKJ{g@e8~Q@bOcٕ_b6Bc_~E2XnZ RB98k:> G_:)>aiJY,,,H"D&6di92&3: I{{KLѪe^ D'ǜ^2#)!źe`28Fr.eMG%\ZgЎg19h>ymĈ koTE0^{X%(MtwsuI |_N玌*tv;ՑtU R۱HWWbpw ڎA!lQ(06HF˨LV PXI)brtpGy+yd2rL:f )oP2J8*P*18mj=e|^4+ { -D:JQ pՍlrLTZA2%Ifs̕Jd9f )pT oOmqu+oT,R4뫛z7ꈋbʆj hJsLB_bp\sQdRXg?'4cSX:$}&0+oIH]L'+OJ5$#79k.Jir΄<&82#[$5C{gӱU AQ/߽]D>m?ОTUv=+|a=vkǼh/CUy)jHS|)d2@$Xppj&MEQ5Rl߽ @O(xՂVa-{(kܺFO QSi%ySŋv]٧TrBc^WAF%K6vzTQO5Wݵ4%!dΖ6㷟5=dT4/ E·S4ϻTfx=2F\,BXE2Us\~얷 /1Id2gRf4P[O4EL~Rr !>ˍԧ2'G )L1;Y[O grgMƁ( SN玌<|9>2f[FB)IzjJ6[} Ѓ4B Rj6TdWE>rg|,7O0  o8KYuuGҚ.=5-Snw#WS'|\q6#Lug&n>^1W`?/9:G Aޟٰk [87DuZvG^Ȏֻ6Tm 2JOr7uqpD`wgO)>o^^.߻}w/gor6ǟ=}J[P=z/ t }@.9.{Ս&􉽛ӭo!J6vH7*{W㍷)ij>O=^!*[+p9GeXKOۮQuXԀbO//?%wq.=; <"{ȪUxwg/5x pTyrs+"91iHݥ67…9]5  dOV*_~ٸ|t| {^ ɰ:VbSmU0>)l7ѵ]̻K 2u\inc_z퉐cvvYjWZ%f |s;^GP: wXbin6,-?K,#D&dvE&kr`Sl)]?6ow{_mu>|NgD+f7Sw{{|Ϣ@7TPPF]Gš!UjM 60 `61})[y\40~<6?5LffaT/RxƟϾ)U-pO~7]+_z'޼]0VE(Ȓ>Ol]3Va!yC\MJ7rsuAݬ2sP[ލbJi r8imj y.7g"S ;HHX | _wh{\'4+{@|6o8VזNoc0:VekO{|{e Q"]LvhK/7ɕgW7ן.>|0vwz7wܿ}z>{Y'Û\o%7l. DPYry1L !: r&AtdέhDJmQZ,^9&ИYEF\$MX*ze?H0%IRLLf'#g .@>? 
'IxT(Y͘ӄM&2E>QYFIs9'e1ȌfY#%I" 34u>3el2l NkD؉Q5<=A 6,P^ MVIIizKPI FEZ(*5L@Nvnܹ7."zYۭo՝SkJxWi=3O쇩_oq'؝_ {J~~m?`)yms䳵x 8·їo/zZoS{;l("$^=_hn2 !Ә͡ԫ!G$G  ۗXQ1ݹF 9&]PwpAHu}#ɻuzSطuFF#qTBdG*o_\i9vFVa2@&讌*+8  ok%2`ꚥA.;ׇa;a_{`?=YOo=)#9K(>%4BǨR奿S PgU^'[ E4H0;=J=;Vsp+tRΗ<ė@]^_/K2yy3/r ~9Wia`-j^!bC!;t*f*¤}hS-ּ5ńAPvn`ۜ;4qx(籮 e[c~Pwl{Nv{ (*~O6j{ߝ}7GѠۑոӬsy4i {&2>@`eٶ0̓5G"$MMD$h 1T- E"&FI`DC6bp;cr4t('b3z1K+b`)+GcIUnjbRKsG6:&2B@Onn&4BM2/drCxa"y! =Rū^?!bC>@E ٢Xt((W~|sSRfJ-QB)x:=m4Ȓ~-q\ [$pesӤɃp0 pQw!_;8.j$QIb23oW[I=<~OipDvr]ً OBG"O=0hՕj!qFUY<2%wlMHw.A2E7P.G脎QEgXJ]AOք|"Z[\+x*4yr"ٯ[Ib_(jLB9&H'c8J8FPihZC7ç4gGG) FEpO^': ^ET^ 8qQe =c $Z+&/nqVL;-P|xt<_:%`DSkդ'Y؃!ԯ'cԾYi 1kK0JR0Yy(Nu`$Uul5 vx pwjWA%hWۺpF8oדR_#H< &,q\`kQ@(f󑡂Lv2;%LifJDj ohVʉ tL97(0?f njY>E}u٬VGO0n.p(Ez|GwZ8My)UΌr]o6 G^@gg_]\~ST_ntUr3v 86y&O='O'.K<5Α␂X0OP Ji Ǯchcc#"xƙ'fl(DfSkYOnF&qZ!oS~U) kьf%cC`se!62Ql\:C/g_h7>0qYzNjaE<,fDgDz1h\ w-_,*{@8Hv-z\CNLyU,HsPZBxBj.$[ ssJ0 2DZf%Dq#Β35R@n\ʝp㼈fTerdkZzs?i=3B6IA] @ϟ_w/r Vio&tnX 8_/їo/FtZo8MWS8NxgV7e|sAba;hgCG[$3L.FutJŝB+$`q rDpOj"(|QĹ a/Y dV;]Fiߵ˩7`j;YͦyXaˏu'cqio|Cޏ։FHhI)1>").#*DSNB) =dO:fI@K?6Owf94R 0mgl>{Y(\{L ZR$Dtf{x\5[=U L TF[%WR)1kN-dBX^+n ɪ;u?t _jtk3LLB ISPȒխ=/M q©O@Si)ԊF۝D7$I@*{~4*i2ҳQݠ]^M-|¥̱Rh'&V|^Gʷc mkNMjsjSgG\]||H4:$qhY~Zt! '`).ӄK:z=,_Ns$-Lic'+Q_g`-Y#+b_,^\xBgP7-;;'$/NUb^:@%<XTTKq*O/vK[@:`Ek>""ObzȕG6.A /y)|.[{i-mM(Ϳu@1njgT6 (ϹZh F)wβǘG($ Cgr^k$t?LtHg,\M<o*aAxw@C,@lusmTs(B'V*&,\`ڽju<\itG"/}2D(F$`bph, 2$1e $ f#'CNRM8 $V,+ԘX a<1IL0(K᱔JB[mÏc OzT`էZ|^'p"1OR!rri)*Ie'ߤ~ǽtL)%:y"X{Ճ )]t"@}߲:q 9֨M+޲f[) ,jIS(%jU3k,NxV95j]iƢǛKZklXoޣW]`Su_Ud_>V|z! ؄p*NM]}+p'9Xzroؐ3;dC}|sSR&BrgZM-|Bsi[KH L,Ҕ=+cmi崕9!TʝmPMH2^/~>՘aQF>Ղ1,\SbDŽ)=Va(5˭\DOqa!m^<,u/P?g3q=̑_F|.04{ d3w(B+AA"-wmF~ m^4;ylm$d'a˲ݶ$%n5_bXII=UKBW˾hMNh3Ն ZaY/%tq<7V/UQVKI}7kr/lFtI^ ֱn +lM&朱g#s^sEՌ*kTbqjy{^FU5uUpMȽ-*ځcK ]Vw@P&YDޤ3ټ=mjӶ݇m%r ,ǖ\k=|(Mrhs\aŲחbJ?D[ ʧz}e7N n w;eUO.^,>۽ jJVVΗzR ]`J)cB"2<}eFG/y[m ;RD S{LWn$AӼ2t em@~w:?3ռyAsA!i̒Pcᴰ6ra훛9uLlD%xsaќOyT)m-lK/ ǔƘ!Wo%8H?o{o3H8_]9%؟i)n=?U%ݏ/3$z%֙dAhḱǖU@b&*Yo*h#ia8j+|xE-Z툋zHƁF~ytp2MThD}Q `T6Ӄrq4(gj# SY\M4$f\fC^7Q!pc]e\[ZmT} ojV&m֨`1^18pѡ^{/ڗ8+RvAWĥ׮.Y`:ȩ7pP&؁eH `,xsȡAQG]4LK4_hdY aƴ+KdeћL-,LC!L7`@N+ڙeqE&-:hxe鱼6khKa2l(FR2_ej~kƂ}v@ݑ{߉H^HL"PRc;=KqM1[ɪ9-Ó`:,ЍbP¶.KzH emhaQ˴g^vcJsZm;,"TГVN,W ci`\c}>3$YohcPzPy]YI7$̐ 33TZA o4Vuu3X52Fcؠ ӣ\5YT8TCH8YwR_k͵ KaD+M,l_s*ٸ?2 27]|2+ +{eY:1mT, ұsƙ̱tX BևOUJk T+y ѯaJ~fՆO J@9ϾB ZAQ( :KeyFP U41͠U|U͠C9Z3@!T5ڌjs*^@VNvwMLadWk9EF/8vKVʖ#Zfu- >(0V S4LYO L-?xP݉󸘛S75rLgY;jh%#{ 8>Lð9F>JA pdύfὡQCf$lPYXSb\suŚ_noo:EwcYh>N3gac'wBYo*4tW8rlz?>c-ڗޚރ ov_\o\&_I4/G$yy~ˋ c럹?O/q|%:*% J}zr{'.:sT :SK(}'if}OFJn˹3Hk:'F`9(JyJ89%^Kl3DI]1-pd\r:ӷ R,ȸ T L`dԋߺ3#f[GZ9_34TZkk40j_+?_kKhl_q\Ԙb ZLdžm;~yE9uTڡZ Vgl` \` Xsi:$`[9 鬉H 2V$5#m,֌ IN՜G+xrYhڀI@+O{8[D bP2Ei \ƈəGmZV-RRuHKvQ]O+MXiix1B;_Ĥ5|}l]x?M_/X 7d0,aj=/S['MK%FFކ]Gm1iJ Ck\au wmmR_RMlyڔkfyD*OcHCr(p(Rk6n4=IlI#ܲ\P>H 6%4I.vBÖrywOPZ$ޘ޼@Ǔ"cOpxQB1cQi)9b43f/JlOqK iC9\1F{™5[(ؿ\B@J,UU*)r6r> | IsQp%*hN2$E˸P \Ipia3.eft$r-r/b#se g6BǛAvA(%51%p"piPZLRdpj׻ X^sVIGJ<>'_LrE3j֤j,ATڏ2TnZT[xrқJ{Ww3Htf|d.̍B1`'Gv˅َO{vTAmNus-!˨F+Úr#MJiq8PY"aT3~GmLј+QY6lc3nS,wJRZVNMV)yBX,aJfʈ6yk_Gq-J|;K/6Qfi٦Lv^SZ+S:oRt#)q\kxcrB`Zmˉ.)Lc9Tbbq$ NO?oʊJD]xט=r!$mL\hv$aUqwPPCҞAd*<CwrC=d eg=1P\Jn08-0sɥ6+-&Z <ߛJ] Zmq n*wIl\]MgClb;ZON!BRכW"مUD.br$RW"s"D.brbR`kdS3]8fS92M\c8V!й(jdA)2G)Q0/$(G&;]RѳۙOYL8<\ v~&" f| \{/ڗ51"Giң:Z %~gn/Kin$Y4 ;A Zvgc 'ZuɘxicFWڧ3@{ oC<];Fy(梷mt:7։cllo>5\ ?Ak! ѝxaw2D= HOp ts;|Hmug-5h= vl՞ZYۿBNF3 dz:[ aj~n(b.{Tag jUU0Qm5*BBx*lW׍G<X5o=i$U˽5zGШwzkttf1NQ!hw]={P,)4=9fo, 4 =hW|kj;jl4':-l? 
:XUcN IZN!'USB;^뛺G!uB{i])$X&^k:N섋uN0^zkmK 82A Fv0L|u*LZ9yAy3{b\K3Я̆ 2"X i2|ٙ ,pJDM!x&yBL[ehM ɰBRHU`%4 c^FF87񶎛I&B r7*'&swi{&4Q ވ2 "N8x K&\zxz"8G%\)̱eѽ0h$H/Uj¿4{nCxP昭-j? T] ۷Mܷy(v{cev%eց{|Ͳ:e.q?pZZK'ܟZ0Xm¾Ҽ[{A)\Л~OgU0 bB i8su [jPW'_ uٞL&˳{^HN3}* ]PE4};h* *"2 ;T*:TTTKզpta=hޗ-ݰ&TTS/5F9kFaty`5>Sm`kLsJjY 봨~U9 Cp:OLuƽ\Vno_' .Gb~;x#Eԏb䕼?yWZf=3wҢa5A@CyO~~SQ;JB4).~ieU%s5s8V p_)aS:Lq.%[>[Fm5E#O;=!#yZQLjRje΅4^j:/KiJns̸9+76J!4uP֥NC4tCL1[*XWxԼT."ekéqv+Ыbп: JkT*xݜ[+ĥubƳ5#=5#'9.e'L4J/>6S4p\E8/ ^̴ù̙RÀ%(XƜ&e{ =-'0zNV ÷~|w\[FN~/Mo~UA_Ӫvit"+fz饖OKP"gUV>gw&+  S!,0K%3RrF2yMeF>T(f| sad2%T me3?YE9|9-E=['ZzGTfk֥n)f/hnWN#˻ECM8 ,u5E?ߍ޽NN}+0ZpqP%;#^Pydtb4u>(>p9 P|%k 9 C^?sM9D`-szPKzۻ己..}L`\]풼knd%4l_%}; yNFm!Cpw]^{x ԡ/}}[ͻU~c-IjgEo96;Hy⑺H!W?G4taݷk޵ϖ&cBv {b%hS%:o;"J:v̧jI5DB8JCkmD䔾Qxpz9Lv :ʍ;'w{" ZW~HsgzYEdhC?GO=zW:#.V(SP2R䘩B;Ty:U往YW*,[lҖEI]˼#0HZKNE?5ab)⭥vJk;k6x )&'1ݡ@Uw~K_bDoEH QM;w2 vPG 8Wȓ^Ov7jl_LK4\$ɕx-J'N%:$=,:,:Yi鋇3NNraTp.2Sߍr<}2{Y:復ևZI@Drg=t}(r*9ՌZD3-X "22_r&hKͨ*mTV]' AKbJ^@ ̴!P9T$Te(,(XhJ+iUTD>W^+ >/3Zqd{VGaB>Rӏǹ"I4rVȻMׅ~|2 1z#g7r_"o8~p7^% Mv皆x+"|w<& P &4#vi6,0#b{,t oӠJ 8y"s2Q 1 Ll7 nw.c_ނs%|6&wm xtYi< BhE_I>_iGɏ&;_~#7mhx:{a\-s*d o@}=?sل1#Q1#q.HReZƑ&$U͎.a,);piYZDp3%D+0΢1܀SE,!*9(JC;sFsw>:@h9+¥d(46Vi3f?eTd<DUZ5>a )u휯.6:-(#gX.ٽ? !+v'u%'wHՙ@̪QSHpH;d kvǃ"u")+d}eCMSn $! *x_(ge$M sbss`0wfIzv(Ok&Ʉ5Y D91|nׅ7io_]9Ci;!bJ^CCZ1"_N_ehhpKwބ竧{W`)C5*UCZs-VS]R@mяai+jKϴB-Bt~aCy]8&~mN[ trR^Esbj>LLkmm I4Y7mE.e/BfZ\Pa`\,GDcN 2u PXg|\iSv6_`4~IFϮ@D]|wDwa10w*#xcS@;ֺa͇ȄbÚe!>z"v i*Ia-oo% @%aBXw; 1- [Z:.̨bgz9_!0ƥ'ln큐gkE5Fu)bɢq{$;2 OSṷL.Hx!W%pDqT2_\rw;8 ݈\ֈsX)F3cRoiېFRvm2ЩLBҝOJ 䣚_3KEFeZA BlcUj̤";MJNF]kP3ALЫԮ ״k.k76sd=> LFX }BVA,";ݚ9\];l1B7H #Tۻ #s6Ӱs4 L *;)R]|jc_%nǞ ";|'P%X7Igֻ:9N.4]+3S~oN{KM1g|?:fmׯ7N5]x}Te0k&St+2L ;M=4p]'.-pF-Ѽ >suh)FkUuFMپDc*kId|UcBDB90\Y&*XSb3̻/id-*]>0`]>R xzDKMğ~,lOB,2))s2))sQM<uAF"񎡀BQ$C;BI!rjT};5#q=:RcҝP> M.Ceo?|m1&d{, w1mk#c&6v(2Y c3ԣ"wߠtkBpۮIMf+>+EZDӺVIH b40)șw4DoGfT7J 0߀is#`4R2 PyH fSLI1$pR,kYJܪD/ D K-J+aLĂYwD$ņp""09}4x#H#br HL[}U>\wӰK<~-Φda({_`\P~*cnAnw} oǓN|$"|CSЌ'?x߹~O. ވR@۽`"Ib5uӢ҇>\;iʾ7R\Rn0[I*]w 9QG?WFo JI1w4<=4p%=GG3݆nr!»;L Z f8?&7{E $}? enBri\g@?ËQ?ÙIUbMyʨ 8R0,d h-*V @a|7N

e=x`閖M$ }X8/|3xA3yAO6+X,#v.H͊1ETXi#GFAYr hX)< g1T): ׷oeˏc 4ԣ1œq'rֿ?˛5v~ZEZiY^eyQ]Ֆ!iZM؁$ksp I܍]) ] ,ۯ ב,SӒWfǕZ1+<<90G1ttF{ `p_y\:Db~04VA3J7ܐ4zXMQp >±a9 ;g1s"8k5;HR_il՗dAIHUVjM%^Jk0mM1_s")uDCpF11bύ#cP>pʶH6)2cS8 bqfrɌ7bZH؋$92ur&eƪ8.DI [ɫNK@H jiM EH!A4E Dž7u]_$-9yC:o>\2LYƸ̤Pp::Dɝ Ap эo1yR)nb_p^!\E-dgS5D0RhmYB`e9bőB"~֒lvr*hu"0}w[HXkKR ^pE^e\/ =WOu#=Œg<ܣҁw7^oUBO~ hq@T>M7eb> x]Ihed>>F,@mozw3+p R7n#f0ZD,>Jf ^UA耈DiS ϙ5N#UF/Ʊ:jVj!iW[ iǺDbS=#8YPh ^# d(MT`>HsB)kE5xBzYƜnBL/ǥPڙpSM1~N ?8e进O7y3z)羑t{cIk;Ĕc?뻡Fu\ EgL)O/tnø؄3K(掞(IoN]RoO?? ܼ;-hn6@&VN@-IȁhJ $Pa]O:7O i!P NP#L-A~YmfǼ['gF[Я)m/hc61L;ak|~3qk?qnVC)zq;B&/x<<ϸ[o!*iG7yfMIG^S[x tWRob'NM`.Ep > z2W*L#r ;ٻ8n%W,ފpX [ֶHry.-Mfca*Yd+"0& _\$LhýLMX#ǃ.*ߠ}!XsQ"|k?Yu$QY!jH+[X Y9Yu緱x/%<(JNk oΨEa{ {jQs *2@᪰輪8M(B.zzJᰭ\dc=c[tN&@6*-ΙEek )x:YXQuHaPФ}.!IX%x`%^Eb`L͵H|{9qd Do% O)&>&QH3;:>$P *AZcQ@?K5+GHJȼ(A @l#Du(Ay2*'rmF#OŠ-qxtTț&z`1V3wV5)4 b;`>lrէΏb]k F ư`M]c)/A0g+ o!2@TIӠy愚MJѬJߖ4SФK䀍Hh@n6Z qaL9[gO & }SG{-1..q&Aa(&qي*LT?0(Bri=/4ǫ@J0@ ^{04@~{ v F-l?ho5ao҄iooBiڴV3ompkdYv{ZkC%vd?-8lm;p `k(ev`xH}(G/Z? I7r~oK&ab ).W`\!/3kyu+05ޮFi=N01j~Y]\vF (]};3<8e ?Kju笞{93K&AI"ab/+35qr7^f rfcCty}&߼/ 56&N4K:,{|gjlf)lU"VRiE?1-»kHsЪɅVQ5yu#m/{]{o}mo SOi,ƌ0] zԉ]g1um[JԉC4 [[VxtqEvwHUPRuq}EwO@iO/_Z[^~;o&/^}9t< dE n_~߿{Bm:y\fu8,je bҪ]lPm6pк$t{} QzT /X ZouOA`QHϢ`)"u-x:V@z0&VOB £D-FStdB60ԑs%\hxSNR1m2A %$S'~`&LԖ ^ !EbBj@; uW W%j b<$Hpdrܚ2TH)ҲC%heF`MF}Z V4ZQ)bKb֖f?dI\:3č#9 ak v5ʍO [ 4C]adi~KR+jWӟj1^ja2F'a?_}L_QLG h.+lENK/T4r*q_$!sս:o?[Uo:k:{qD4W ([n .:&7@5i1sf.hO.~W2"mVa-dn{ƫKwdžQZgG6eR.Vh`׻TGk#6;HL)g.oW1L Ͷ8i7Y>Ռ]h5SGBPY&!ȠƠ+FT3: I{rF`E F#U.*e:  @]~tdoޏ CmkכbRn"@dpc6LN/ߏOEݯe]]r@~wT74U}|w~Vɿqvܜ1Z3!$ݻH&h/NjsgJ05w;Ъ)n VWF/BD.j&ފw;}|yl!+V+ ,{|?zZ`m1 ln}et_6fG iaUCn̜5f Y\,V,5L-Ǜ>~gn.M.M߉Q7\ 9{ZE@[v3Exlq,Rsb])ʻ:DNe+S7*,BIh _^-=|^|=I/mVTMf:E2s`j *^lal7\+r)D{KI }\;BV$"qeh62(󤦉6MxqɃ '=FؤPH҉"J.I#,Dc B*ǽ皃 Oݰ %s folNF ء|): !|&p4Q+U [SƤFi'XRx> 0RIp*ܠ~ 4)pf!&hgNC86X:Ȧ)D@qJ^LA:M#4AjG&& i0hZ 1ɫ\)XMV˘10 :a3ٿGr29=fԆ+=Rjis$n/%Nq08i$MiIk02zԖUugAG@9 `o~_;Do|LzI/x &p %#oѨe t(%R:@L#Ȭ?HLQ(*-4% lhw2N\a%p3M 25:4^[@RrJzA{ TCJqJX)=(hA8j@|чJjxWCL85DU*_nF^rލо=e nvm󗻼(Vb䛎:J({|?c4Ϝ4fW5׈-9yǫDZِasc"[ZbYӇ`yǪ!07fVK 7ŗMSdqd7p>!ᾄWゔ!Jiujŵh˂2Gz|q5=5%:ZYCW<ʼnW`eF ѷpO*#roE,Y]}s DgVU"Ԭ ɰ<~}B[io {.>uȰǕE-ץ§ߞ=s=|im!7*e}2!+alrŸi9{J\ѱbIyb7[e6甮XcU"-;_U=X3Jh;YqcZtrE6hQr]G&-R*i͚\hz9E>&*RhzI,h ȌV] ʅ(=+eiL > ̀PF,BI } G5N-kUq=m1DsČ MSg9>zq0a&bLp :Gyv s&n|ֻ#l6Z,ˤfPNd-؃<?4L?dSErv%g7s;$oo&i1 i^|-< tĂS\鏿D%yߡVzGU )ˮ.st*Z;CThHJb[P EP:&h?Q;;-.n7yֻ֬_^fxVG]+V8‚뗗O|.O*WK#˒BնzĄV5ߦe9>0w)%d #<ߒp}Í /~|7O߿/c2ԝ3ԞRT h幎NP絉`ݮ+l{`Lcvkirgw\( T„# Ń R2wWTE-r:6;єf}ԼF鍐`wh7%u柚{ ; rJsrQ1%Cιr򆪎UdBAL=+cd_3C2]f&tdW W 8.lF}.^mL`mg`L7p0 j1Jw%=Wp6jykuB3[uPm7Q=9+n+C| 486h)m}=AV`jX eVm&H}m\Dݷ.ݝ}BSkڵ& P/nWX)!jO[|=h2@qrQV{BcTxβ%A[q_Yb*Af-Qq-d^%d rMHssl;L)<Lwr%+m$Gb0+6PO==ۍY4<<:VvAI:,J.eYJ1_D#85 h{.}^1HMNT#W!2u%-">q;Zd֑'O[W@tHZA۔Sk8?J hUAxVgոU 6Q&ƊdAi Lr$5oyz ~&+jE-%qv^q{}4Z ,FԭS3ؤL3&,- /^z{ pj ~f_2~s{<6.-ЪV|Kt==wd9߽ݸ {qir{+YRX>+W$=I+*b:ySYfOAbP":NOMj>$䕋LALSʰz$=rt!{Čѻv=R\ˮ6Bڜ| XG6[[kxMe&UZVդ C5pӲ.?=Xct>۪ d 5)4+'V0>= j;xhwJLfDIԳ%p.ߎ킴(&\ bK֌-I"}M g{1aMCmY\]QQqXmك#ܾ/nw[˧tE< d=Os/~ d艅\eˬ+,]krR FqFJCR3seO6H ˶ō(Nh+ƕi.l)4aZOv\8=f"Q7HiM6FIr06q@{Zgr:A49d%OťC%ՐUɊ\`5^1@LK Ajp2jP;Icu@iqdjPm>@aҐs\pL2F퍄.(H "7 H) Z"kG? 
_\_ţ/6zv5Viv*Ieh @2ue SmV6N&;ڸttLmfȖVOk2{*Hak3N8R0''+*hs'~ (N)M^ITLTjc U|6I}&Gd=a+dC 8U  tp>`r\w2Y$KCHē&#,c9]RF 0e\T*SϼL,D2o%ACH6ɇltZpo*ó0.Ș Rs .Q Ly?s5ڃ Xq2 Ĉ-nĄ}_˷#:An["8 ?@"l "P>"`U %nTyzjƒF0FV|&smmY ?GO)sU ǼCtT K,žA'@y6O:hZAޕMJ0Sr 7Ia`$k@k R}7w:lgo[t ]}bAn=b~Fyգ3p7}\tP>-PTw_ͻS?~ZmD&[Gw+Q8KOދ%!F>A/[,C|?ˏ~F5Lv?,ˋt=_ܮ}nn?xI炱?ܸAYϟL.D& % }˥⛻,JPVȋuM> Mo-+',XAIݺ{t)xЪVХc K+!!\DdX6Ѽ4 bP":NOݪ'ZW.H>Rk.ɮi'v2TzZ!|L ]^Bz}jfs U#|;xZDjib9 u| C 0eЩ~R A(a9+ ;o۪XPH2L79( zwVI,0A?E˜aJX\7Vos%V\|8E-u}W(ιg WBI\OVy[Y/@Y,%SKzoBUeCY鎁@DflƘo菨q/;IFBEǴpѡE=8iz!I ~ PG)pe$<m\#nC^ =F٢|j xe&9xwm9# @v{TVGX]0Vs)Ą5+a$pS`V1p[ )!M|[W fQGsLi4sQr Kܪ| RFIczCvK籫Q3mɉX oH.Wȑq nNq15聈H,HDRā57siq'dW\R={ ^F^x!@jWYk@1)o;s"TN)Jna1̼?eaQe; F`-jkpe8`Ү\V#+iTȺi$q[ Y3i&&0@>MC\F$[M nmܵNA4lWb'l"=6,J)R4[8*زnE\F[9[ ׁZC Am]"sOɔd+ؠd܁NَajSڭ?9M#y]:$HnM>1&+)AїjdF !쓛o8cngs/<,>myY](Ec238~9[*$VђIn<ޕy}{d{Q4[jX43l fqOvcծZ0בӅ!#1ގ͛/gl~N ?}3Z¢=8oFq(|aѷLqBHAoک+v1FoZ0}F]82CֽiBnFL. Fru!+Gb헀˼A7Nn+6֒q4mDVYglR!pI]x%hx#h"o酺̕o{V`?=JkWq{~nk P{<װ+*"0+cv6T4)Zp؛! ΰc߇1`]G$ͮ)JQkԒOZ,vjU,Ԅ]_ņy\G~|z9,\>-non(u&dBJM6fM|"*GxƜ ?7r{)|%nGad?*1Jş_ 4EqO^LԚuZ, M'V"$ؓ1$!D^6ò4jFD($hZ˖568k$K :?&iT|m9(3ejX!c]I0Dju.@A@583h%'qR_5ʁ:kYE8ќ?*d"JV & Y Gl7~.0aA(<.6$(M'xI !5|BԼ&`zߋ5d©ɈQx+6Fgbj6Rn.kNzJb֭ǚHtL8&X:U54x%FпgxnG۞KHCp^]b,tq0 |DD E^JJمЪ #u;2B9iM7! -{1?Lt '7iRt]N(I\kw0a}=ʴ~>(L5@9=ڋɬ;= MΗ.êJaHLd"㧅_!E2K޾Yc aH|忱$"Og^).+4&$Y8.6"] ZGq/Īv'W=e]d(EiDM/[_vOw/tދߦrCkJ"I |[Ƃ+k˳q6V;;ዂ3RL1\`҃dU,ē,sZy1G(/R`/ o_jp&u*j1FM( U/nJ2E C_HzMα^;PvN)s K磝R^PPi"m43'˧L$1d9 w.XR]kx*NR)促@@>/|=OOyTd6'kdt"c2aa}w:ZS= 8jlmYB1.hBWevwg}sKCO4İم|Xh|=)>Ġ#ug7+bB1":ʰPt2QÐNUh\ȋta0BYecfnx&H4OIȨU@EӌG`߅Q 5QmZ8`uĠc^nwUjN.]x/SEF,Jc+.ؔ(ˍd6`)bTyf5ϡ9PJ_wVʯ (DZ;?r :yߒGU>@AF!sĢn_r* &  4\pwG^?p'Z,q%vNq7o[PբmMh> ns( @:cڊOtcO.-n ]<5"zoh"ͫI<=g!M gq{w/OdS$tJJ( ~t-E {?&Lcq1S9 M_ƶ+|b!J/W.J>J'92ȥH1ý\w q1Śn$0sJLQryI|eTg {| 4!BuĂ^!hpݽOR[u !]d]}ЁhI }U.I˂W3 zhᎴfuFHB(}vQa=DY,4Z!BN$wM\`8 2VC&8L)F  )$0rNd. F94ӽ^neYhF󧏫ni;'ͻOv~3f7o$lƮ/JcL(Z6 23Ge<wWHc* ^Q9|[F8Fb'*ͧۛHr ,1A)BLLfQ1HfmGZӀk!'98|T~XN94`WV ,\rOPl&g2r%^"1 OcJD`[ 7(8JF)% %#2^r( p53.x' L;K+JZHչ :$N6d'7}n.MnBE7nϨkS\r)+F2ermpB|!18nSOWEAӁt8elx$n*dSx { WGܮ?SڬgRҬu+H<ѶFJ*4yar1Լ-7i}v }X@ڀ,#!ƥȳ<ǃVy.gX«ꕹUua!+*SP{sT8a⎟!GU&RǍ*"i&ЋAva7V,͵ˬQBqk\n<מ*,c^s*\_Zxx\N[mqGu BFPA2O8UXA!fMsg*(\#m[ZqF9z55[Ȫ;h-8C}`K,'`d_P^ H0Y]lYixGCnY~5 4}v6Ly4],WRC Cw>Gv?߭q7,fS^ghߌ6E6E6E6ٸyc,ǥTaOV?Df!EҏhBkFeQ,eX>.xλ6\(#ѧ80;1̷y%(M=e=~wS BО!hnBAP3SHû* S.KFPIz9P5^b͠R$Ub E)CE\< TfS<JglƙXg yM"F @2 9H1y{y9 8T+" vb#V+i zl44Y*A8\kCDB.}X`CY59E MyE!6jV,|J`NBCd+/|ԮLrИj?P O/c?[xUd@pY>-HA"ć4%*J_UA,NqCVQ>)vܔPOzP 8z%nd֫޹ܒTT2eVzsHB鯭Kgr[ӽ3ںT(B[C{uGB+Rrſ2.eR*fD6b'j=^z9t =դ.˙Biݑzuw kNNngoϾws Huj]_6AGYnZdZ*p Q ĂG5RPu\ˍt&b`˴(DDP*U҈ 62 1{4kAD+P->ĄW.*A׃qw}(aQ8>R A{J@@o-DrZV{4imvkZ[][[OХ\,8݈1Ҝ1mM]ص[M !gTݹ3ɰ3f܈/E?D>^ʳ΄NIgj禇3t\yZ-7ZӮ{sPJݴ ᮦ(- җTj~Xmeh (jl%ze!qNU}2+qqm%YY8'N0΄v6C])䦼5A;Jv.M,kH׸M$hI 4r4ks.k&jCY0vWEPu&n{'Z;Ƕm+Z g LӉ Gd)RHU)D&:n6(OQ0e7]>Snh%[TH](QwQUlQ%K`;MtІ@xysl^*C̪|~Za}K~Wn[ݎ~Zex]>ROu7 !E!۠͛^E)!>P@%7$;)<hIc^r W\m~FY6 y,ب؞F󧏫#Gap'ͻOv~3f7o摔}aiE(EMׯ|R>R/Z Ij~D1 j@͓P.=][-ʛ'B|6@gjXYri&JOG*B/Z+/\=BB!S5urӽ3Vh!=UȪj9돑]9B ZEdZEҹcaFiŮbU!P9) 5X11Eݘy0̙`5u0xω!@Y('1rc{~*hl_s\j]_\SIijI.@4q*'BhqVJ\ZOUOY-n  m*l\.!%&C¨64B0k3xpFkMC.p)h@5h>QԀUk(}_;5Q~"]i4UŋuQ׹U kI>3L!̈r׈s9ѐ~TƪbMg LG3P6rT2YMOljU1#5ySJAʃ|خV,G*v ( Y-R&V!irj WMi[K#PLOg_Äؘ!WJAi0^rNZ)}Cؿum S~])80ư {dD=\(*hOA\w}W C+>;nē 5 $_+QEXG"z=쫲q\= d$9WvnZq k.$븺HkIg"<|? 
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1265285902]: [13.546843953s] [13.546843953s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.806503 4720 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.806994 4720 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.807585 4720 trace.go:236] Trace[1620683101]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:25.838) (total time: 13969ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1620683101]: ---"Objects listed" error: 13969ms (14:29:39.807)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1620683101]: [13.969146287s] [13.969146287s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.807604 4720 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808089 4720 trace.go:236] Trace[1077416000]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:25.148) (total time: 14659ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1077416000]: ---"Objects listed" error: 14659ms (14:29:39.807)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1077416000]: [14.659375726s] [14.659375726s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808110 4720 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808397 4720 trace.go:236] Trace[798014851]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:27.041) (total time: 12766ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[798014851]: ---"Objects listed" error: 12766ms (14:29:39.808)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[798014851]: [12.766663278s] [12.766663278s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808417 4720 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: E0121 14:29:39.809938 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.842336 4720 csr.go:261] certificate signing request csr-x7cfs is approved, waiting to be issued
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.859140 4720 csr.go:257] certificate signing request csr-x7cfs is issued
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.286092 4720 apiserver.go:52] "Watching apiserver"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290114 4720 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290338 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290655 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290837 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290959 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291005 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.291069 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.291716 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291760 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291792 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.292458 4720 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.293772 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294151 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294182 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294483 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294574 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.295359 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297211 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297342 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297711 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.301927 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:11:52.193791656 +0000 UTC
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309223 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309270 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309318 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309345 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309363 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309381 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309400 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309445 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309464 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309483 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309503 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309613 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309633 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309657 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309733 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309755 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309774 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309812 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309834 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309854 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309874 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309894 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309918 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309967 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309980 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309987 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310077 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310101 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310123 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310144 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310166 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310190 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310211 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310232 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310255 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310278 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310364 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310385 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310405 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310433 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310458 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310509 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310526 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310542 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310558 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310572 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310617 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310633 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310646 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310663 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310848 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310865 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310880 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310897 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310915 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310929 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310943 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310959 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310972 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310989 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311004 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311027 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311048 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311063 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311079 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311095 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311109 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311124 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311153 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311181 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311196 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311230 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311244 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311259 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311275 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311289 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311305 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311335 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311350 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311426 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311448 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311465 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311480 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311515 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311531 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311547 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311561 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311577 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311592 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311607 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311621 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311638 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311658 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311724 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311739 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311754 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311769 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311784 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311817 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311832 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311847 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311878 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311894 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311910 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311944 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311958 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311974 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312024 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121
14:29:40.312042 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312058 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312074 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312089 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312105 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312169 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312184 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312200 4720 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312218 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312233 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312248 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312263 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312279 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312294 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312312 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312343 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312361 4720 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312382 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312406 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312439 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312455 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312471 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312486 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312501 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312532 4720 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312549 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312564 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312579 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312595 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312626 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312642 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312665 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312692 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312709 4720 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312743 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312775 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312791 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312808 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312823 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312839 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312856 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 
14:29:40.312872 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312888 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312938 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312954 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312970 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312987 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313003 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313020 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313037 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313054 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313071 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313088 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313105 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313121 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313137 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313155 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313171 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313209 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313236 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313254 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313271 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313292 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313312 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313347 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313395 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313415 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313433 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313453 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313469 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313514 4720 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318246 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.331821 4720 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310187 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310403 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310569 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310790 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310926 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311063 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311223 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311709 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312060 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312211 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312957 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313170 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313354 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313453 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313528 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313894 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314106 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314231 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345843 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314702 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314714 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314763 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314842 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314904 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315024 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315124 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315200 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315504 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315621 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315955 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316010 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316208 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316229 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346704 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316365 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316481 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316761 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.317663 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318144 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318302 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318372 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318470 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318545 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318632 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318828 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318857 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.321771 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322159 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322166 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322342 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322350 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322374 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323209 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323374 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323480 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324364 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324788 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324998 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325176 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325483 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325734 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325783 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326224 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326338 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326424 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326461 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326504 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326617 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326729 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327160 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327379 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327499 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327556 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327596 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327857 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327870 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328154 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328253 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328523 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328575 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328861 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328898 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328936 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.329581 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.329891 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330341 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330535 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.338745 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.340123 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.340466 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341214 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341409 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341405 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341805 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.342469 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345140 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345154 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345344 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314436 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346030 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346403 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346479 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346543 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346784 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347167 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347342 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347498 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347519 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347729 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347804 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347954 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348072 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348388 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348689 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348834 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348849 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350553 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350734 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350813 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350913 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350876 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.351376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.851222441 +0000 UTC m=+18.759962373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351423 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351918 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352134 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352241 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352475 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.353582 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354397 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354487 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354760 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354826 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341957 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355382 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355459 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355690 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.356092 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358579 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359609 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359731 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358001 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.363083 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359488 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.364827 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.365093 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.365929 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366146 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366460 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366723 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.364640 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371449 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372035 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372222 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372750 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373205 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373516 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.374018 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.374339 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375070 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375392 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375478 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371904 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377359 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377501 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373791 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.377758 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.377930 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.378011 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.378080 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.379789 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.380625 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.380697 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.877574565 +0000 UTC m=+18.786314587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.380734 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-21 14:29:40.880717527 +0000 UTC m=+18.789457459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.380749 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.382361 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382416 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382430 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382442 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382519 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.882497789 +0000 UTC m=+18.791237801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382553 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.882545801 +0000 UTC m=+18.791285823 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.384814 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.386121 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.386433 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.387021 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.392775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.394201 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.403850 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414801 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414873 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414929 4720 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414942 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414952 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414961 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414968 4720 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414977 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414985 4720 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414994 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415003 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415011 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415020 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415027 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415036 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415046 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415057 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415067 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415076 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415084 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415092 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415100 4720 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node 
\"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415107 4720 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415115 4720 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415123 4720 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415132 4720 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415140 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415164 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415172 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415182 4720 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415191 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415200 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415208 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415234 4720 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415242 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" 
DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415249 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415257 4720 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415266 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415274 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415282 4720 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415292 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415301 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415309 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415317 4720 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415325 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415335 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415344 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415366 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415374 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415382 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415390 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415399 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415407 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415415 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415423 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415433 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415449 4720 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415467 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415480 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415509 4720 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415519 4720 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415529 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415541 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415549 4720 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415557 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415549 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415566 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415624 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415644 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415664 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415694 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415708 4720 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415720 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415743 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415755 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415768 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415780 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415794 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415806 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415821 4720 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415836 4720 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415848 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415859 4720 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415871 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415884 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415896 4720 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415910 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415922 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415934 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415945 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415957 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415968 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415980 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415992 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416004 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416016 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416027 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416038 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416050 4720 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416062 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416074 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416086 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416099 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416112 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416124 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416136 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416149 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416159 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416169 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416180 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416192 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416204 4720 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416221 4720 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416234 4720 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416248 4720 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416258 4720 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416268 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416278 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416288 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416298 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416309 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416319 4720 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416616 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416629 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416643 4720 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416666 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418432 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418531 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418590 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418640 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420284 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420307 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420320 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420333 4720 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420345 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420359 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420370 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420385 
4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420398 4720 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420409 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420422 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420433 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420447 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420460 4720 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420472 4720 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420484 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420495 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420506 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420517 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420544 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" 
DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420555 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420567 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420579 4720 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420590 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420601 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420612 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420624 4720 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420635 4720 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420646 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420682 4720 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420696 4720 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420707 4720 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420720 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc 
kubenswrapper[4720]: I0121 14:29:40.420731 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420742 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420754 4720 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420764 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420775 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420787 4720 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420798 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420810 4720 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420821 4720 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420834 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420845 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420857 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420869 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420881 4720 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420894 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420906 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420917 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420928 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420940 4720 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420956 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420968 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420980 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420991 4720 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421002 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421015 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421026 4720 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421037 4720 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421048 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421059 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.417044 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416325 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.423817 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.426046 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.428300 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.428653 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.438613 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.448241 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.457856 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.469287 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.480617 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.489581 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.501334 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521627 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521673 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521684 4720 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521696 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.606329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.618608 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3 WatchSource:0}: Error finding container dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3: Status 404 returned error can't find the container with id dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.628374 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.640001 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace WatchSource:0}: Error finding container 658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace: Status 404 returned error can't find the container with id 658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.675414 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.675451 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.682381 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.683267 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.684723 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.686123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.686722 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.687756 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.688244 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.688395 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.688862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6 WatchSource:0}: Error finding container ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6: Status 404 returned error can't find the container with id ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.689051 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.690067 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.690547 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.695309 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.700588 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.701088 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.702329 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: 
I0121 14:29:40.702633 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.702828 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.703729 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.704253 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.704636 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.706429 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.707102 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.707613 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.713000 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.713462 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.714593 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.714988 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.716208 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.717125 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.718150 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.718207 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.719439 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.720545 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.721765 4720 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.721883 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.723499 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.724091 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.724891 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.726688 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 21 
14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.727315 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.727736 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.728163 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.728747 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.729864 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.730289 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 21 14:29:40 crc 
kubenswrapper[4720]: I0121 14:29:40.731970 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.732537 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.733851 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.734430 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.735422 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.736307 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.737509 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.737976 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.738799 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.739384 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.740462 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741013 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741422 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741788 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.742611 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.742645 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.751356 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.757680 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.758616 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.759267 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.760879 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.762896 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754" exitCode=255 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.762934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.764910 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.777594 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.788505 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.790175 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.791363 4720 scope.go:117] "RemoveContainer" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.801581 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.813799 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.827582 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.839633 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.850112 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.860105 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-21 14:24:39 +0000 UTC, rotation deadline is 2026-11-11 09:26:44.839555046 +0000 UTC Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.860153 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7050h57m3.979404603s for next certificate rotation Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.867687 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.878262 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.891825 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.906165 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.916088 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.924982 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925050 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925095 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925114 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925180 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925153862 +0000 UTC m=+19.833893794 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925234 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925283 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925321 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925269475 +0000 UTC m=+19.834009407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925329 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925361 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925406 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925393969 +0000 UTC m=+19.834133971 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925451 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925498 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925514 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925461 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925577 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925556334 +0000 UTC m=+19.834296336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925598 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925588814 +0000 UTC m=+19.834328746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.926826 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.938358 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21
T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.302456 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:55:04.753390073 +0000 UTC Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.367891 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-k4qfb"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.368194 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.370031 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.373623 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.375212 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.419610 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.429341 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.429378 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc 
kubenswrapper[4720]: I0121 14:29:41.454297 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.481165 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.509161 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530004 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530141 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.547467 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.566963 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.577175 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.600135 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.618666 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.648285 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.677291 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.677302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.677520 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.677405 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.690366 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: W0121 14:29:41.702469 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24af441_df03_462d_914a_165777766cf4.slice/crio-36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be WatchSource:0}: Error finding container 36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be: Status 404 returned error can't find the container with id 36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.774908 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4qfb" event={"ID":"d24af441-df03-462d-914a-165777766cf4","Type":"ContainerStarted","Data":"36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.776895 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.782612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.782960 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.784357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.786604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.786638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.817956 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.818918 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2pbsk"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.819244 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.820608 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5r9wf"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.821097 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.826354 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.826532 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.827557 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-w85dm"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.827871 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.833985 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.834701 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.835061 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849512 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849787 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849836 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849902 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.850141 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.850343 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.855751 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.856771 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.860960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.864049 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866457 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866745 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866866 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866944 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867062 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867425 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.897481 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.915249 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.931492 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933568 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933680 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933709 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933730 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933751 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933770 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933790 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933810 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933827 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933874 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933896 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933916 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933934 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933953 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933971 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934011 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934029 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934242 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934271 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934287 
4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934302 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934330 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934379 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934408 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934440 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934470 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934490 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934508 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934553 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934572 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934610 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934654 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934690 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934713 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934732 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934746 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934848 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.934834233 +0000 UTC m=+21.843574165 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934892 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934919 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.934913955 +0000 UTC m=+21.843653887 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935111 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935124 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935133 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935185 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.935177553 +0000 UTC m=+21.843917485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935278 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935293 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935299 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935317 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.935312156 +0000 UTC m=+21.844052098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935442 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935461 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.93545597 +0000 UTC m=+21.844195902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.946510 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.963250 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.985670 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.997520 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.009594 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.020945 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.031956 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035147 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035170 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035204 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035218 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035235 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035244 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035288 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035251 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035243 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: 
\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035339 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035355 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035392 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035371 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035439 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035461 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035477 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: 
I0121 14:29:42.035505 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035521 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035539 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035559 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035582 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035597 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035661 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035711 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035728 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035769 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035787 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035804 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035820 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035916 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035951 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035958 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035970 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035997 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036041 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036081 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036096 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036110 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036140 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036155 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036172 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036189 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036235 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036268 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036288 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036290 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036312 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036332 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036371 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036522 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036538 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036543 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036566 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036765 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036837 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036879 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037166 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037217 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: 
\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037401 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037560 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037600 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037639 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037739 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037755 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037898 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: 
I0121 14:29:42.039970 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.040484 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.048757 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14cdc412-e60b-4b9b-b37d-33b1f061f44d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5r9wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.055623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc 
kubenswrapper[4720]: I0121 14:29:42.056293 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.057861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.058124 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.067949 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w85dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz2xr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w85dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.078152 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.086511 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.105508 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.119761 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.130437 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.134393 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.144083 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1128ddd_06c2_4255_aa17_b62aa0f8a996.slice/crio-c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b WatchSource:0}: Error finding container c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b: Status 404 returned error can't find the container with id c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.144948 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.156975 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.157278 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14cdc412_e60b_4b9b_b37d_33b1f061f44d.slice/crio-6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee WatchSource:0}: Error finding container 6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee: Status 404 returned error can't find the container with id 6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.163900 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pbsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.172434 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.172854 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda40805c6_ef8a_4ae0_bb5b_1834d257e8c6.slice/crio-a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79 WatchSource:0}: Error finding container a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79: Status 404 returned error can't find the container with id a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.191919 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zr5bd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.196645 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac61c15b_6fe9_4c83_9ca7_588095ab1a29.slice/crio-ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e WatchSource:0}: Error finding container 
ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e: Status 404 returned error can't find the container with id ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.220395 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.227715 4720 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228041 4720 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228071 4720 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no 
items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228092 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228119 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228128 4720 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228132 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228150 4720 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228168 4720 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233102 4720 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233128 4720 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233141 4720 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233295 4720 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233545 4720 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233834 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234008 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234045 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234079 4720 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234135 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234157 4720 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234532 4720 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234610 4720 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228044 4720 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.304746 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:11:14.422962604 +0000 UTC Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.378920 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.404207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.409288 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.441327 
4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.474388 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.486990 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pbsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.564280 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zr5bd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.598015 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.629012 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.679591 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.679704 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: E0121 14:29:42.679763 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.722745 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.753836 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14cdc412-e60b-4b9b-b37d-33b1f061f44d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\
\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5r9wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.777570 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w85dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz2xr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w85dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.795279 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ef0f5a17ed649013616aa7b2d7e65b892f61c734ceec9ad1d7443d10876af78e"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.799863 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4qfb" event={"ID":"d24af441-df03-462d-914a-165777766cf4","Type":"ContainerStarted","Data":"c3482bd848f0a6c664ad2fc29e050d1dd9f33ac0c4233db1f9c778f22060b4b7"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.801379 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.801414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"e987849aae01f0808d10da2fa7a849ecf45678e5acbdff5e43105398fd5e192a"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804093 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805192 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" exitCode=0 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805241 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805256 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.808617 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823396 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="c51909f5d4fb259c3b11d7389e46f2e135ead03f40bfbbd5807e38246de5e808" exitCode=0 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823912 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"c51909f5d4fb259c3b11d7389e46f2e135ead03f40bfbbd5807e38246de5e808"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823940 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerStarted","Data":"6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.825834 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: E0121 14:29:42.838937 4720 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.852926 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.895226 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.924627 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3482bd848f0a6c664ad2fc29e050d1dd9f33ac0c4233db1f9c778f22060b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.960828 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f5a17ed649013616aa7b2d7e65b892f61c734ceec9ad1d7443d10876af78e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.010074 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013330 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013364 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013372 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013457 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.030796 4720 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.031126 4720 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032199 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032240 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:43 crc 
kubenswrapper[4720]: I0121 14:29:43.032253 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032269 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032282 4720 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T14:29:43Z","lastTransitionTime":"2026-01-21T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.079098 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podStartSLOduration=2.079082981 podStartE2EDuration="2.079082981s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.078950818 +0000 UTC m=+20.987690770" watchObservedRunningTime="2026-01-21 14:29:43.079082981 +0000 UTC m=+20.987822903" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.098421 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.102269 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.102582 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104451 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104680 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104822 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104952 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.142274 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153807 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153855 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153916 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153959 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.167894 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.167876442 podStartE2EDuration="1.167876442s" podCreationTimestamp="2026-01-21 14:29:42 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.167302125 +0000 UTC m=+21.076042057" watchObservedRunningTime="2026-01-21 14:29:43.167876442 +0000 UTC m=+21.076616384" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.186502 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.214294 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.214271223 podStartE2EDuration="3.214271223s" podCreationTimestamp="2026-01-21 14:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.189441576 +0000 UTC m=+21.098181528" watchObservedRunningTime="2026-01-21 14:29:43.214271223 +0000 UTC m=+21.123011175" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.236653 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254713 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254745 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254779 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255449 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255719 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.259024 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.260900 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x5ldg"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.261208 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.262933 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.263151 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.263532 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.264272 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.274778 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.306143 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 07:45:35.116189346 +0000 UTC Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.306200 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.332635 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.334046 4720 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.336719 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.336700396 podStartE2EDuration="3.336700396s" podCreationTimestamp="2026-01-21 14:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.336570542 +0000 UTC m=+21.245310494" watchObservedRunningTime="2026-01-21 14:29:43.336700396 +0000 UTC m=+21.245440328" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.337117 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w85dm" podStartSLOduration=2.337110997 podStartE2EDuration="2.337110997s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.299192604 +0000 UTC m=+21.207932546" watchObservedRunningTime="2026-01-21 14:29:43.337110997 +0000 UTC m=+21.245850929" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.355575 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc 
kubenswrapper[4720]: I0121 14:29:43.355613 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.355638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.366510 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k4qfb" podStartSLOduration=2.366493261 podStartE2EDuration="2.366493261s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.36536749 +0000 UTC m=+21.274107422" watchObservedRunningTime="2026-01-21 14:29:43.366493261 +0000 UTC m=+21.275233193" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.412153 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.414060 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.418550 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.427427 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.445625 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456200 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456219 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456479 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.457273 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.470306 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.481296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.547823 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.560701 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.561075 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.562830 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.563165 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.572329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.586949 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: W0121 14:29:43.591751 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790bf9ea_decc_4a7a_b349_bf7358d50842.slice/crio-3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca WatchSource:0}: Error finding container 3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca: Status 404 returned error can't find the container with id 3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.596173 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.596633 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.596717 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.608019 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.647401 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658159 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658355 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfqm\" (UniqueName: \"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658566 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658677 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658799 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.677243 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.677286 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.677356 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.677710 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.687395 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.708729 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.728116 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.750034 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759422 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfqm\" (UniqueName: \"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759523 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759551 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 
14:29:43.759581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.759841 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.759896 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:44.259881043 +0000 UTC m=+22.168621015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.760736 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.760748 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.765688 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.784497 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.800544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfqm\" (UniqueName: 
\"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.807195 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.827053 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.830059 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="33c8d4e7303bd3b7659ada685627eec03fed3192711e42d04c1b4ba547abb7d7" exitCode=0 Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.830150 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"33c8d4e7303bd3b7659ada685627eec03fed3192711e42d04c1b4ba547abb7d7"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.832076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5ldg" event={"ID":"790bf9ea-decc-4a7a-b349-bf7358d50842","Type":"ContainerStarted","Data":"3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.835439 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" event={"ID":"e6f177bb-4eff-4b46-bc6b-0712b4b787ac","Type":"ContainerStarted","Data":"dc9f16dddb9a855e83cdad9b82369b603c8b6a1148856dd1c05e19ff3e26d54f"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842012 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842049 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842061 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842072 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.847328 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.867775 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.960787 4720 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.960927 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.960901921 +0000 UTC m=+25.869641863 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961136 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961242 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.961750 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.961805 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.961794136 +0000 UTC m=+25.870534078 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962543 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962568 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962582 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962620 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.962606158 +0000 UTC m=+25.871346100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.963632 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.963705 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.963693689 +0000 UTC m=+25.872433631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964259 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964288 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964299 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964374 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.964357657 +0000 UTC m=+25.873097589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.075708 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx"
Jan 21 14:29:44 crc kubenswrapper[4720]: W0121 14:29:44.098924 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c8f4e3_ac08_4482_b686_a4b1618e051d.slice/crio-6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70 WatchSource:0}: Error finding container 6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70: Status 404 returned error can't find the container with id 6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.265726 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.265832 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.265875 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:45.265861243 +0000 UTC m=+23.174601175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.677204 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.677616 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.850774 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" event={"ID":"e6f177bb-4eff-4b46-bc6b-0712b4b787ac","Type":"ContainerStarted","Data":"2ea9fd8cccf96a7a49117394d251bbe2af6588e5015076c31ed1c46098fcf8c7"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.853881 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.855799 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="09018c7b53b0ac3ffa11afab29d27968cab8b22bd7df3ef7ac95f773b2bca6c5" exitCode=0
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.855870 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"09018c7b53b0ac3ffa11afab29d27968cab8b22bd7df3ef7ac95f773b2bca6c5"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.857068 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5ldg" event={"ID":"790bf9ea-decc-4a7a-b349-bf7358d50842","Type":"ContainerStarted","Data":"d5d7991d26c12ced440dcb34bb4789ce5ba5f14b99bdda7210328b02094ca76d"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.861985 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"4d7058ee5b191ba3e2eda1651d9fbfe809e9755faa96a201cefe8834b47387b9"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.862035 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70"}
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.865543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" podStartSLOduration=3.865524299 podStartE2EDuration="3.865524299s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:44.864314765 +0000 UTC m=+22.773054717" watchObservedRunningTime="2026-01-21 14:29:44.865524299 +0000 UTC m=+22.774264241"
Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.907435 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x5ldg" podStartSLOduration=2.907413724 podStartE2EDuration="2.907413724s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:44.906730705 +0000 UTC m=+22.815470657" watchObservedRunningTime="2026-01-21 14:29:44.907413724 +0000 UTC m=+22.816153656"
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.276912 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.277110 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.277161 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.277145462 +0000 UTC m=+25.185885394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677850 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677933 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677944 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.677968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.678013 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.678097 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.870664 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="550b7d659a2217ce1d8c62b141c2b73ef539f9cc9abef756b517e7d2290bc55f" exitCode=0
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.870705 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"550b7d659a2217ce1d8c62b141c2b73ef539f9cc9abef756b517e7d2290bc55f"}
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.872259 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"4b529a8be01df9ff160bd230126312dcee298bbd33c9ad49582dc39b4fe7b034"}
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.877559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"}
Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.901303 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" podStartSLOduration=3.901285176 podStartE2EDuration="3.901285176s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:45.899105784 +0000 UTC m=+23.807845746" watchObservedRunningTime="2026-01-21 14:29:45.901285176 +0000 UTC m=+23.810025098"
Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.677444 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:46 crc kubenswrapper[4720]: E0121 14:29:46.677569 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.886347 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="c112b8b02df124c91a8f0307eb2756328ec9a65a83a3ddc4c767bc2a57c58335" exitCode=0
Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.886426 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"c112b8b02df124c91a8f0307eb2756328ec9a65a83a3ddc4c767bc2a57c58335"}
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.296793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.296934 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.296986 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:51.296972885 +0000 UTC m=+29.205712817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677911 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677943 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678034 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678167 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678259 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.899345 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"}
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.904961 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="cf8224219befa4cd9404148c59cc21d712fed48c71e842ff810a2f9df97e9301" exitCode=0
Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.905005 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"cf8224219befa4cd9404148c59cc21d712fed48c71e842ff810a2f9df97e9301"}
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.003922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004038 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004137 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004109686 +0000 UTC m=+33.912849628 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004172 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004217 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004235 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004305 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004310 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004299511 +0000 UTC m=+33.913039553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004374 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004358123 +0000 UTC m=+33.913098125 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004416 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004534 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004577 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004566438 +0000 UTC m=+33.913306440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004583 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004601 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004614 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004691 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004679971 +0000 UTC m=+33.913419963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.678149 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.678479 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.911229 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerStarted","Data":"dcbe475b94875f9187e3876f749e33c56cdbc98bbdb7843109a51d6b85182eeb"}
Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.933144 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" podStartSLOduration=7.933129758 podStartE2EDuration="7.933129758s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:48.930261068 +0000 UTC m=+26.839001010" watchObservedRunningTime="2026-01-21 14:29:48.933129758 +0000 UTC m=+26.841869690"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.677977 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.678003 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678112 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678281 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.678002 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678565 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.919837 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"}
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.921168 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.921212 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.952172 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podStartSLOduration=8.952153005 podStartE2EDuration="8.952153005s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:49.951308301 +0000 UTC m=+27.860048243" watchObservedRunningTime="2026-01-21 14:29:49.952153005 +0000 UTC m=+27.860892937"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.954429 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.956413 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:29:50 crc kubenswrapper[4720]: I0121 14:29:50.677480 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:50 crc kubenswrapper[4720]: E0121 14:29:50.677931 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:50 crc kubenswrapper[4720]: I0121 14:29:50.921989 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.336160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.336336 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.336382 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.336368103 +0000 UTC m=+37.245108025 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678157 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678191 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678170 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678304 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678417 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678472 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.925107 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.400254 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"]
Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.400360 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:52 crc kubenswrapper[4720]: E0121 14:29:52.400453 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.677296 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:52 crc kubenswrapper[4720]: E0121 14:29:52.678574 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.982105 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:29:53 crc kubenswrapper[4720]: I0121 14:29:53.677700 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:53 crc kubenswrapper[4720]: E0121 14:29:53.677853 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:53 crc kubenswrapper[4720]: I0121 14:29:53.677970 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:53 crc kubenswrapper[4720]: E0121 14:29:53.678178 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:54 crc kubenswrapper[4720]: I0121 14:29:54.677213 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:54 crc kubenswrapper[4720]: I0121 14:29:54.677255 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:54 crc kubenswrapper[4720]: E0121 14:29:54.677495 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f"
Jan 21 14:29:54 crc kubenswrapper[4720]: E0121 14:29:54.677376 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.230621 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.231110 4720 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.264738 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265409 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265583 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265986 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.274681 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.274910 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275525 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275885 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276035 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275926 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276123 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276310 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276413 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276477 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276543 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276544 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276822 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276978 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277071 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277512 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277606 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.279315 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.285172 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-42g76"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.285646 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.287559 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.287897 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.288047 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.288138 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.289984 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290055 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290166 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290450 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290559 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.291554 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.292010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.292012 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wmxb9"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.294566 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.294837 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.296073 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.296124 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.307899 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308283 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.307988 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308547 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308590 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308548 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308796 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309108 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309259 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309005 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309467 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309616 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309707 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309385 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309884 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309975 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310043 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310073 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310217 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310445 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310010 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310620 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310386 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310410 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310454 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.312503 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310956 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.311026 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.312905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.313168 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.317826 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.320068 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.320688 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.321060 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.321611 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.323481 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.323744 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.326778 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.327469 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.328322 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.328890 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.329073 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.331626 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.336477 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.338313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.340469 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.340725 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"]
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.364634 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.365216 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.366614 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.366877 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.366998 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367043 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811926 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811961 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812018 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812056 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367079 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812079 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812100 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812123 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812159 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812178 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367237 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812238 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812259 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812280 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812296 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367402 4720 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812389 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812409 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812428 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812468 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812488 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367447 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812548 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812567 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812604 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367493 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812675 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812736 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812764 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812789 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812816 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812848 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812880 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812905 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: 
\"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812934 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812962 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812988 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813017 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813040 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813091 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813117 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813141 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813168 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813194 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813222 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813250 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813281 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813341 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod 
\"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813370 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813424 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813498 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813601 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813640 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: 
\"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813725 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813779 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813811 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813869 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813928 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.368276 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 
14:29:56.818302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.818857 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811272 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811290 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813381 4720 request.go:700] Waited for 1.482992398s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/secrets?fieldSelector=metadata.name%3Dmetrics-tls&limit=500&resourceVersion=0 Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813815 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813878 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838157 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838334 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838508 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838754 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.839082 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.840503 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841109 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841326 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841736 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.842213 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.843936 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.844222 4720 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845155 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845232 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845352 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845524 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845723 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845903 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846035 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846361 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846542 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.847526 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.848807 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.849566 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"] Jan 21 14:29:56 crc kubenswrapper[4720]: E0121 14:29:56.850020 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.849987623 +0000 UTC m=+50.758727565 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.851331 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.851830 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852170 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852354 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852586 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.854911 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.855686 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.856064 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.856377 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.857083 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.857641 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.859689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.861192 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.863628 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864081 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864258 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864755 4720 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.865172 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.865447 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.871782 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.872110 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.872372 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.874290 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.885198 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.888689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.889195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.889632 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.894427 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.895784 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.896981 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.898526 4720 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.899400 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.901791 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.922993 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923086 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923119 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923152 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923189 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923223 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923276 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923324 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923377 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923408 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923440 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923469 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923506 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923537 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923559 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923618 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923705 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923756 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923792 
4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923811 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923825 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923847 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923864 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923885 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923903 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923921 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923940 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923961 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923980 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924013 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924034 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924092 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: 
\"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924114 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924132 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924152 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924169 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924189 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924230 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924246 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924264 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924306 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924324 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924342 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924361 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924380 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924414 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod 
\"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924433 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924470 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924491 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924584 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924606 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" 
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924648 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924683 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924718 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924764 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924787 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924808 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924829 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924838 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-njjgs"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.925571 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.925734 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.926105 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.926221 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927539 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924848 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927894 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928828 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928875 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928942 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929891 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.930114 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.941103 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.941696 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.942883 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.943144 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.945046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928771 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.950777 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951790 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.953497 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.954784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955759 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956728 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.962689 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5qcz5"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.963273 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.964129 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.960732 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.960162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955913 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956832 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973457 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.958429 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973837 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973984 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.974419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956125 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.975756 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.974006 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.976147 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.976636 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.978286 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.980022 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.981318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.981787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956206 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956807 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956850 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.957312 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.983507 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"] Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.983807 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973475 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.984943 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.004190 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.004502 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005747 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005810 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005834 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.006038 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.006708 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007283 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007509 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008624 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008747 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009137 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009553 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.011770 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.012346 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013116 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013478 4720 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013852 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.014203 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.014617 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015129 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015254 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015390 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015538 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015646 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015759 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015990 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015257 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016545 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016561 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016487 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016765 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016516 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016918 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017340 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017505 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017727 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.018221 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.018877 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019676 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019676 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019826 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019938 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020076 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020220 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020615 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020783 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021052 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021186 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021467 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.022277 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.023386 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.024236 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.024738 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.025250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027390 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027707 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027728 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027967 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.028122 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.028409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.029102 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.030627 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.030809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031203 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031255 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031273 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031288 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031328 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031344 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031361 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031412 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031465 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031487 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031504 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032011 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: 
I0121 14:29:57.032530 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032570 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tx54b"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032588 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032750 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.032930 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.532917923 +0000 UTC m=+35.441657845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033070 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033398 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033650 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034014 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: 
\"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034113 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034425 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034642 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035082 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035413 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035621 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035752 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035785 
4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035842 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035893 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040512 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040542 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040577 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040615 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod 
\"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040673 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040703 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040721 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040786 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc 
kubenswrapper[4720]: I0121 14:29:57.038229 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040823 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040922 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040949 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040972 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041003 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041049 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041069 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041090 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041113 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035929 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035972 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036218 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.042264 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037682 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.042817 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036328 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.048281 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.038294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.039410 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036341 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.049240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036367 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036406 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036418 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036450 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.051886 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036556 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036597 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.052515 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.052525 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036635 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037405 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.038026 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.053686 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.053814 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.054171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055446 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.057169 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.058602 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.060402 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.061052 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.061085 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.065143 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.065278 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.066178 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.067209 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njjgs"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.067575 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.068482 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.069470 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.070582 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.071816 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.072767 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.075188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.076979 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.077208 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.078402 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.079785 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.080789 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.081938 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.082872 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.083700 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.085887 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.086149 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087002 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087831 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087981 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.088577 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.089633 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.090085 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.090492 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.095859 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.098907 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.099858 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.101602 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.103825 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.107818 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.121551 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.121760 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.130415 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.130785 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.139978 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.140977 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142396 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142665 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142700 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142762 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142779 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 
14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142815 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142831 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142879 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142898 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142917 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142978 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142998 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143019 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143040 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143087 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143107 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143124 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143146 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143169 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: 
\"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143191 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143216 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143233 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143255 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143290 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143307 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143331 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143349 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143363 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143397 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143423 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143445 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vzw\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143577 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143643 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143783 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143800 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143852 
4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143868 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143884 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143963 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144003 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144038 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144060 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144095 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144115 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144134 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144165 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144180 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144195 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144234 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144250 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144265 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144298 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144351 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144384 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144400 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144417 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144451 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144492 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144546 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144578 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144596 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144611 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144627 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144641 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.144695 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144716 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.144950 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.644935414 +0000 UTC m=+35.553675346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.146562 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.147165 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.148724 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.152524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.153453 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.154558 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.154741 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.153788 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.155340 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.156439 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.159296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.160318 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.166061 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.166138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.167117 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.167468 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.168563 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.169769 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.169893 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.170023 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.172194 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.173977 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.188646 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.196329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.206304 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.212373 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249378 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249406 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249576 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: 
\"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249682 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249706 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249785 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249806 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249821 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249877 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249966 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vzw\" (UniqueName: 
\"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250133 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250195 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250213 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250272 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250291 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250602 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.251148 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.751135753 +0000 UTC m=+35.659875685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251379 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251519 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251534 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251545 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.252822 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.252994 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258909 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258949 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258980 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: 
\"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259042 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259115 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259138 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.259156 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259175 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259218 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259255 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259311 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " 
pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259346 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259362 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259380 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259403 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259422 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259445 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259464 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259487 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259507 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259524 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259539 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259554 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259570 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259588 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259606 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259627 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259644 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: 
\"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259996 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260625 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.261075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.264958 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.266220 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.266226 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.267390 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268160 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268295 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.269672 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.270620 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.271554 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.273133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.273536 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.276386 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.284294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.287480 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.287847 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.289756 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290134 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290348 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290348 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290795 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.291490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.293964 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297033 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297315 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297801 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.298022 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.301556 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.303117 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.307371 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.309641 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.317674 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.331454 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.359473 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.361216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.361736 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.861721904 +0000 UTC m=+35.770461836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.367702 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.368640 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.387997 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.404390 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.409515 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.421051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.429705 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.447738 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.451997 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.464295 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.464674 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.96464192 +0000 UTC m=+35.873381862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.468019 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.489081 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.507736 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.514123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.565543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
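The TearDown and MountDevice failures around this point all share one cause: kubelet resolves "kubernetes.io/csi/..." volumes through drivers that register over its plugin-registration socket, and kubevirt.io.hostpath-provisioner has not registered yet. The csi-hostpathplugin-j577t pod that provides it is itself still having its volumes mounted in these same entries, so this is presumably just startup ordering; nestedpendingoperations reschedules each failed operation (durationBeforeRetry 500ms) until registration happens. A node's registered drivers are mirrored into its CSINode object, which makes the state checkable from outside. A minimal sketch, assuming client-go and a local kubeconfig; the node name "crc" and the driver name come from the log itself:

```go
// csinode_check.go — print the CSI drivers registered on node "crc"; this is
// the list that "driver name ... not found in the list of registered CSI
// drivers" is consulting (kubelet's in-memory view, mirrored into CSINode).
package main

import (
	"context"
	"fmt"
	"log"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	if len(csiNode.Spec.Drivers) == 0 {
		fmt.Println("no CSI drivers registered yet; csi volumes will keep retrying")
		return
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Printf("registered driver: %s (node ID %s)\n", d.Name, d.NodeID)
	}
}
```

Once kubevirt.io.hostpath-provisioner appears in that list, the pending mount for image-registry-697d97f7c8-6kjwf and the pending teardown for pod 8f668bae-612b-4b75-9490-919e737c6a3b would be expected to succeed on their next scheduled retry, with no operator intervention.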
Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.566028 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.066006013 +0000 UTC m=+35.974745945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.567942 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.568891 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.569551 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.069536261 +0000 UTC m=+35.978276193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.573825 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.584850 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.589127 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.608523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.612164 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 21 14:29:57 crc 
kubenswrapper[4720]: I0121 14:29:57.628510 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.651098 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.660620 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.670039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.670214 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.670629 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.170609296 +0000 UTC m=+36.079349228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.672915 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.678476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.691917 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.696232 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.699376 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.709416 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.712356 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.723087 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.728430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.748321 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.763203 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.770809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.772081 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.774953 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.274935021 +0000 UTC m=+36.183674943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.776505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.790243 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.807271 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.834871 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.843566 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerStarted","Data":"30cad834f566f85c0f3a6de4d149c40b4e51c114cf6d66d633ef1b6be4e13903"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.843775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.847782 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.849581 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerStarted","Data":"1a03a4355bd12eae90e463960102d7b8d0f28a5a014b426c9235206feb008d3a"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.851795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"ea6b107f3c3026106a39c3caf82ba6fa45ff98f05ec346681fa1da42911087dc"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.856438 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"2d808d09bbc7f5cb2b76e9766c449e0d4ba9970ce27620ef9bfe40a6dd0a49ae"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.873074 4720 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.888900 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.907296 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.907880 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.407860609 +0000 UTC m=+36.316600541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.909204 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.954693 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: W0121 14:29:57.987391 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d WatchSource:0}: Error finding container 9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d: Status 404 returned error can't find the container with id 9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.988202 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: W0121 14:29:57.990087 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408 WatchSource:0}: Error finding container 7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408: Status 404 returned error can't find the container with id 7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408 Jan 21 14:29:57 crc 
kubenswrapper[4720]: I0121 14:29:57.994903 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.007792 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.010210 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.010911 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.510893999 +0000 UTC m=+36.419633931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.011547 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.027049 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.061695 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.064821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.070709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:58 crc 
kubenswrapper[4720]: I0121 14:29:58.076785 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.089792 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.092202 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.116192 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.116845 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.616808419 +0000 UTC m=+36.525548351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.117629 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.117963 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.617947511 +0000 UTC m=+36.526687443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.118078 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.122686 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.137428 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.137833 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.138644 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.144113 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.147461 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.149754 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.161686 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.166057 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.172975 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.174050 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.185982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.187960 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.195789 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.204524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.205381 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.219339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.219698 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.719671024 +0000 UTC m=+36.628410966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.222131 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.222579 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.722559955 +0000 UTC m=+36.631299887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.223671 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"]
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.240797 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.240996 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"
Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.241100 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a655c79_a709_4d61_8209_200b86144e8b.slice/crio-646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5 WatchSource:0}: Error finding container 646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5: Status 404 returned error can't find the container with id 646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.250671 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.260814 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.263218 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.268238 4720 request.go:700] Waited for 1.016211237s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.271583 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f685084_f748_4a34_9020_4d562f2a6d45.slice/crio-fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce WatchSource:0}: Error finding container fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce: Status 404 returned error can't find the container with id fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.285969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vzw\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.300840 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.306868 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55572f9_fbba_4efa_a6a8_94884f06f9c3.slice/crio-352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f WatchSource:0}: Error finding container 352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f: Status 404 returned error can't find the container with id 352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.322111 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323080 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.323465 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.823441714 +0000 UTC m=+36.732181646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323538 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.324162 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 14:29:58.824150253 +0000 UTC m=+36.732890185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.344516 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.354100 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.361896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.362243 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.387863 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.393230 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.404023 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.416726 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.426400 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.426807 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.926786022 +0000 UTC m=+36.835525954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.430915 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.459927 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.460371 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.468747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.486374 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.493419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.513510 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.519777 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.525060 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.527828 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.528750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.529037 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.029025499 +0000 UTC m=+36.937765431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.535313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njjgs"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.546930 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.554072 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.557205 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.560466 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.571313 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.588910 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.622465 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.627534 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.630327 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.630445 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.130398052 +0000 UTC m=+37.039137984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.632482 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.634903 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.134888697 +0000 UTC m=+37.043628629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.647080 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.670895 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.757924 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.766205 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.771315 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.271288543 +0000 UTC m=+37.180028485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.805430 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.819347 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.876947 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.877376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.377354137 +0000 UTC m=+37.286094069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.908921 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" event={"ID":"07f01852-61b7-4eee-acd6-3d8b8e2b1c85","Type":"ContainerStarted","Data":"caea177b5f0adf7c5f6e8a02e65d3f9ae7d67ff099a93964e7ab412ce9e46e6c"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.932088 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerStarted","Data":"730d32c8330d5c4334a6753d43cdbf6d8a2df14b78b053f47d3d07095fcee77d"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.967701 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerStarted","Data":"54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.968532 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.969721 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" event={"ID":"61315eef-fa85-4828-9668-f6f4b1484453","Type":"ContainerStarted","Data":"55f075296e8423f9114226e936856791cb02be1ccdcb20ed13ad16519691ab13"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.970914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"911c154ea6ac5002a0cd6e707382e586c9634aa743c1b2f50d132f6faf6b5bf0"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.971671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.972088 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.972518 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.973076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wmxb9" event={"ID":"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f","Type":"ContainerStarted","Data":"14ed6946c854a6a14009130861a7696037b95e5d52a4a6dd40a1adb4c9d59449"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.973529 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5qcz5" event={"ID":"f55572f9-fbba-4efa-a6a8-94884f06f9c3","Type":"ContainerStarted","Data":"352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.974129 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerStarted","Data":"28165debc992515a62bbac33db73e05a5347bebc002b160765e6c1b991bcf92e"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.013806 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.015066 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.515047379 +0000 UTC m=+37.423787311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.047319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68kgl" event={"ID":"aa2b643f-ce1f-45db-ba7f-31a5fc037650","Type":"ContainerStarted","Data":"3ed02c720c537b7e78dbc2a5f6f2d51e8ef65ed74b8145de14b35b2453cf5f8c"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.064045 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fa78d6afe2df95f13e50a466ef3667471e4037d61eb2a4603f8ac150650e44de"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.064086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.090980 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.101841 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"c7e2197c41007ce5863566b89eb987559f54f35ec9ebeedd08355a3bfb6357f3"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.117229 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.117552 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.617541173 +0000 UTC m=+37.526281105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.218005 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.218221 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.718191455 +0000 UTC m=+37.626931397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.218516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.218818 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.718810972 +0000 UTC m=+37.627550904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.230418 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.288763 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.319771 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.319914 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.819886028 +0000 UTC m=+37.728625960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.320049 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.320319 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.820312139 +0000 UTC m=+37.729052071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.421257 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.421473 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.421898 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.921874937 +0000 UTC m=+37.830614869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.426835 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.523062 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.523789 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.023764684 +0000 UTC m=+37.932504646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.533746 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.624020 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.624308 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.124292694 +0000 UTC m=+38.033032626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.725765 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.726223 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.226211592 +0000 UTC m=+38.134951524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.827128 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.827550 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.327524323 +0000 UTC m=+38.236264255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.928226 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.928476 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.428465854 +0000 UTC m=+38.337205786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.968677 4720 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7xcc8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.968728 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.021779 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podStartSLOduration=18.02176278 podStartE2EDuration="18.02176278s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:00.021414211 +0000 UTC m=+37.930154143" watchObservedRunningTime="2026-01-21 14:30:00.02176278 +0000 UTC m=+37.930502712" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.029339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.029712 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.529698873 +0000 UTC m=+38.438438805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.126584 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.130948 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.131708 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132883 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132909 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.133011 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.133321 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.633302658 +0000 UTC m=+38.542042580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.143234 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.233585 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.234181 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.734157147 +0000 UTC m=+38.642897079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234275 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234461 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234535 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.234875 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.734862476 +0000 UTC m=+38.643602408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.235267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.237411 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.290397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.335582 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.335711 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.835690754 +0000 UTC m=+38.744430686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.335987 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.336257 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.836249989 +0000 UTC m=+38.744989921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.437026 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.437146 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.937128328 +0000 UTC m=+38.845868260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.437354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.437800 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.937782397 +0000 UTC m=+38.846522329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.446011 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.496971 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.534797 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.537635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.537857 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.037843602 +0000 UTC m=+38.946583524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.546501 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerStarted","Data":"830f00cd4952a252732ae85fe73bd3c43f95902077b3e9a257094be91b79359d"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.549259 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerStarted","Data":"543f488bde8e56735d3445c774cd0398dc361a33f186699055133f8c7fa305d8"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.557097 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.561774 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"f899e54eba5bb0854d1b9456c7cf01b8a5a481e9e3238929b78818f3b24c5b44"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.586460 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"38c854ff6a882796bf5206f1b2a34d74cc62160e677f74ec8a029aa0fa14c832"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.613984 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" event={"ID":"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb","Type":"ContainerStarted","Data":"59913a4d0ff59fcd0eab8b556446aa66c2becc4ceb9ccb98fbf6b567d3915574"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.642321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.642570 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.142559489 +0000 UTC m=+39.051299421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.745141 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.745353 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.245338662 +0000 UTC m=+39.154078594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.789231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" event={"ID":"4a47e9b4-6318-4f71-9db0-105be2ada134","Type":"ContainerStarted","Data":"2d753d0c2455fc03b4f3a72d390e6307ba52fba190bd7b98c5a610d05b2c4ee5"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.798282 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"985ce529eb1d976cd0ffd92077ff301852642b7b995a148d25e1219249de4740"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.799390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93344ee17cf14cce453ccb518314043a33398185bb204146ef9bdfb7b437dacc"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.801866 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerStarted","Data":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.804331 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.834786 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" podStartSLOduration=18.834772349 
podStartE2EDuration="18.834772349s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:00.833718 +0000 UTC m=+38.742457942" watchObservedRunningTime="2026-01-21 14:30:00.834772349 +0000 UTC m=+38.743512281" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.845137 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.848568 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.849215 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.349195844 +0000 UTC m=+39.257935776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.955330 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.955573 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.455555767 +0000 UTC m=+39.364295699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.056849 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.057170 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.557158556 +0000 UTC m=+39.465898488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.157451 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.158118 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.658096496 +0000 UTC m=+39.566836428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.260326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.260614 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.760602592 +0000 UTC m=+39.669342524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.271592 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.362762 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.362985 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.862957362 +0000 UTC m=+39.771697294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.363179 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.363887 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.863878947 +0000 UTC m=+39.772618879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.466616 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.466790 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.966761043 +0000 UTC m=+39.875500985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.468346 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.470545 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.970521928 +0000 UTC m=+39.879261860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.521311 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.524198 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.569497 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.569645 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.069624087 +0000 UTC m=+39.978364019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.570001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.570347 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.070333437 +0000 UTC m=+39.979073369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.613583 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"] Jan 21 14:30:01 crc kubenswrapper[4720]: W0121 14:30:01.662482 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1cfb10_4405_4ab9_8631_690622069d01.slice/crio-74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2 WatchSource:0}: Error finding container 74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2: Status 404 returned error can't find the container with id 74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2 Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.670501 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.670916 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.170898218 +0000 UTC m=+40.079638150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.773724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.774075 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.27405913 +0000 UTC m=+40.182799062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: W0121 14:30:01.842859 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d3c944_8def_4f95_a3cb_781f929f5f28.slice/crio-97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4 WatchSource:0}: Error finding container 97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4: Status 404 returned error can't find the container with id 97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4 Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.874052 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.875162 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.875410 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.375393082 +0000 UTC m=+40.284133014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.941311 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"97e218b3b126bf9534123d7899f10f559637b9f620034ec27917fa080f36a0bd"} Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.969922 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"8eb7645d76fc197f4b6ce9d91e9c31dfae161f32772e83b30ac5430eeabd4a4a"} Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.979920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.980197 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.480183971 +0000 UTC m=+40.388923903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.011821 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerStarted","Data":"617f70e18e4e0f9b72a22ff92ce1fc94aae99827e9d16ba9cde606ce5a9e499c"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.012465 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.041527 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68kgl" event={"ID":"aa2b643f-ce1f-45db-ba7f-31a5fc037650","Type":"ContainerStarted","Data":"b1a4cab934777cf770ad7290079c5da558f7333410699ddf844c2a8c64c585b5"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.042475 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.063150 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" event={"ID":"ac33402e-edb9-41ab-bb76-b17108b5ea0d","Type":"ContainerStarted","Data":"ced6fef271fbd98e2b6b3396d759018065bba984c864f66fa3e3d7be0d03573a"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.080323 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.081599 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.581583154 +0000 UTC m=+40.490323086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.091423 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"e5a5cec21014d0688ed0467706fadd3d5b85174d4bb148ee1c3623b6fe9f43a2"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.094306 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac39f2f_2411_4585_b15c_c473b2fdc077.slice/crio-6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c WatchSource:0}: Error finding container 6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c: Status 404 returned error can't find the container with id 6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.118588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" event={"ID":"07f01852-61b7-4eee-acd6-3d8b8e2b1c85","Type":"ContainerStarted","Data":"c0fbadd1246427f81683f1be63e303c0ac271a63273c1af4367d0f668557c8f5"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.142418 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"4c60f97e493a2e76e74de04c15dfa162453b75bb780736505c177f53ef30ab92"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.151231 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e042627_4d69_4cc5_a00d_849fe4ce76f0.slice/crio-ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857 WatchSource:0}: Error finding container ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857: Status 404 returned error can't find the container with id ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.155163 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" event={"ID":"61315eef-fa85-4828-9668-f6f4b1484453","Type":"ContainerStarted","Data":"4e25f17cddec9b580bbcb602930a790de59b6ea31c39e52a64dc1e8ad2d3faa2"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.157994 4720 generic.go:334] "Generic (PLEG): container finished" podID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerID="abf435badffbc5ced9e3437c64cfcc64798e18b35727d62a3c9141855d35cd94" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.158077 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerDied","Data":"abf435badffbc5ced9e3437c64cfcc64798e18b35727d62a3c9141855d35cd94"} Jan 21 14:30:02 crc kubenswrapper[4720]: 
I0121 14:30:02.201547 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.202107 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.702091114 +0000 UTC m=+40.610831046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.214167 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" event={"ID":"4a47e9b4-6318-4f71-9db0-105be2ada134","Type":"ContainerStarted","Data":"3ef40d035796e973c9c22661f757ce13326369cced21ba41ca9890b9313d2d47"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.232419 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerDied","Data":"6916309afb8a14b18f202b1fd06253cdb2b2c4bbbebd08448656d331945a6344"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.232581 4720 generic.go:334] "Generic (PLEG): container finished" podID="90b6768c-8240-4fc1-a760-59d79a3c1c02" containerID="6916309afb8a14b18f202b1fd06253cdb2b2c4bbbebd08448656d331945a6344" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.237270 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" event={"ID":"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6","Type":"ContainerStarted","Data":"b3b04114286f227c680a1ea5cf6cd19487e347d3c26ab08256886d11a48843d9"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.277862 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" podStartSLOduration=20.277838248 podStartE2EDuration="20.277838248s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.236414027 +0000 UTC m=+40.145153959" watchObservedRunningTime="2026-01-21 14:30:02.277838248 +0000 UTC m=+40.186578190" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.278580 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-68kgl" podStartSLOduration=21.278573699 podStartE2EDuration="21.278573699s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.19985318 +0000 UTC m=+40.108593122" watchObservedRunningTime="2026-01-21 14:30:02.278573699 +0000 UTC m=+40.187313631" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.303595 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerStarted","Data":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.307804 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.308706 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.808690633 +0000 UTC m=+40.717430565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.326295 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"6e84c6d90cb593eb086278fd37d11ee3fed453a2060bb4188e4ef4fb7a4d218f"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.341847 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" podStartSLOduration=20.341829252 podStartE2EDuration="20.341829252s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.333810467 +0000 UTC m=+40.242550409" watchObservedRunningTime="2026-01-21 14:30:02.341829252 +0000 UTC m=+40.250569184" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.390046 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.397723 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" podStartSLOduration=20.397699989 podStartE2EDuration="20.397699989s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.387696039 +0000 UTC m=+40.296435981" watchObservedRunningTime="2026-01-21 14:30:02.397699989 +0000 UTC m=+40.306439921" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 
14:30:02.408998 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.409387 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.909373146 +0000 UTC m=+40.818113078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.432591 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.456556 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.482219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" event={"ID":"25067bcc-8503-442b-b348-87d7e1321dbd","Type":"ContainerStarted","Data":"fe3c5854542ee296733c003d876d747f7a26a5fd7fa183542d37f1a656b67cec"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.483207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.509583 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.511085 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.011065198 +0000 UTC m=+40.919805130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.524780 4720 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-86dvk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.527342 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podUID="25067bcc-8503-442b-b348-87d7e1321dbd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.554049 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b43777fe6ea26bdc85574cfeba6c1859f9374dabe02fcf1d36c70c6f1335d99b"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.584389 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" podStartSLOduration=20.584372104 podStartE2EDuration="20.584372104s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.539863705 +0000 UTC m=+40.448603677" watchObservedRunningTime="2026-01-21 14:30:02.584372104 +0000 UTC m=+40.493112036" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.585472 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"5a8050bcef0cf8fb732b68561f00f528026eaa34ede030eddcf4dcd5080b6027"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.603384 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" event={"ID":"29766114-9e0b-4064-8010-8f426935f834","Type":"ContainerStarted","Data":"6e6cbcd76291b87b152f21b35c336619a38f7f472b65b95b758fff01b973265d"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.604762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" event={"ID":"61f96497-68d8-4347-b831-f7bc0204c677","Type":"ContainerStarted","Data":"5d848b34325d188a6dff5cc5277de22b8b084fa3ca1f2ceb5d3804f9cd4823f9"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.612630 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: 
\"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.613995 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.113977964 +0000 UTC m=+41.022717896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.625935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerStarted","Data":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.626363 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.669003 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" event={"ID":"fd1cfb10-4405-4ab9-8631-690622069d01","Type":"ContainerStarted","Data":"74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.718104 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.718196 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.218178477 +0000 UTC m=+41.126918419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.718784 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.719114 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.219101762 +0000 UTC m=+41.127841694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.724301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tx54b" event={"ID":"728ae7a4-9793-4555-abbb-b8a352700089","Type":"ContainerStarted","Data":"635b5fa44bf914969c674a4d54bb1c50eb6e0f9d246af8687dcc8add89711c07"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.781945 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wmxb9" event={"ID":"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f","Type":"ContainerStarted","Data":"c027252a8c92ff92166d093045af10483648801b002abfe63a08aa857ac45d75"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.781994 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.809801 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerStarted","Data":"606f33407deb43968f7cc7f66c83d922a2e45672a6ac0cad952ee6a566842321"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.811118 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.811179 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: 
connect: connection refused" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.812919 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-42g76" podStartSLOduration=21.812904812 podStartE2EDuration="21.812904812s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.794426604 +0000 UTC m=+40.703166546" watchObservedRunningTime="2026-01-21 14:30:02.812904812 +0000 UTC m=+40.721644744" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.813272 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" podStartSLOduration=21.813265223 podStartE2EDuration="21.813265223s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.635061115 +0000 UTC m=+40.543801047" watchObservedRunningTime="2026-01-21 14:30:02.813265223 +0000 UTC m=+40.722005155" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.827766 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.828189 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.328170221 +0000 UTC m=+41.236910163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.862737 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f685084-f748-4a34-9020-4d562f2a6d45" containerID="2d7b0571d83db10d3caafeac10de4c427dd5874863779648edb433b1cbbca003" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.862828 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerDied","Data":"2d7b0571d83db10d3caafeac10de4c427dd5874863779648edb433b1cbbca003"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.897397 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd390eca3_a064_441f_b469_3111e626bcae.slice/crio-20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb WatchSource:0}: Error finding container 20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb: Status 404 returned error can't find the container with id 20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.899010 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5qcz5" event={"ID":"f55572f9-fbba-4efa-a6a8-94884f06f9c3","Type":"ContainerStarted","Data":"9e3871719184587eb6610bac36b8397f0e697272b42dcf037846d27b27051e59"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.934294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.935970 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.435956633 +0000 UTC m=+41.344696575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.000851 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"9fef56ce8894e77f9bfde39511b1715f349de7b85e40b799dde370407d27e676"} Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.066069 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.078223 4720 patch_prober.go:28] interesting pod/console-operator-58897d9998-68kgl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.078282 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-68kgl" podUID="aa2b643f-ce1f-45db-ba7f-31a5fc037650" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.094937 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.113337 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.613310237 +0000 UTC m=+41.522050179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.126353 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.142452 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:03 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:03 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:03 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.142498 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.219737 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.221101 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.7210892 +0000 UTC m=+41.629829132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.277567 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.325646 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.325920 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.825903049 +0000 UTC m=+41.734642981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.427635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.428124 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.928110465 +0000 UTC m=+41.836850397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.535517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.535595 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.035578738 +0000 UTC m=+41.944318670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.536045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.536311 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.036304159 +0000 UTC m=+41.945044091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.595100 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" podStartSLOduration=21.595078057 podStartE2EDuration="21.595078057s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.593807892 +0000 UTC m=+41.502547854" watchObservedRunningTime="2026-01-21 14:30:03.595078057 +0000 UTC m=+41.503817989" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.636846 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.637382 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.137366253 +0000 UTC m=+42.046106185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.693411 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podStartSLOduration=21.693389154 podStartE2EDuration="21.693389154s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.68502557 +0000 UTC m=+41.593765502" watchObservedRunningTime="2026-01-21 14:30:03.693389154 +0000 UTC m=+41.602129086" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.739946 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.740197 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.240186317 +0000 UTC m=+42.148926249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.842278 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.842583 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.342563668 +0000 UTC m=+42.251303600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.958694 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.959191 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.459180518 +0000 UTC m=+42.367920450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.990342 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tx54b" podStartSLOduration=8.990328041 podStartE2EDuration="8.990328041s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.988318705 +0000 UTC m=+41.897058637" watchObservedRunningTime="2026-01-21 14:30:03.990328041 +0000 UTC m=+41.899067973" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.991610 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wmxb9" podStartSLOduration=22.991592897 podStartE2EDuration="22.991592897s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.869402031 +0000 UTC m=+41.778141983" watchObservedRunningTime="2026-01-21 14:30:03.991592897 +0000 UTC m=+41.900332829" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.060761 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.061046 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-01-21 14:30:04.561030954 +0000 UTC m=+42.469770876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.069242 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.082844 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.098124 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.122761 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.140073 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:04 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:04 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:04 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.140152 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.145199 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5qcz5" podStartSLOduration=22.145182294 podStartE2EDuration="22.145182294s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.140956396 +0000 UTC m=+42.049696328" watchObservedRunningTime="2026-01-21 14:30:04.145182294 +0000 UTC m=+42.053922226" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.177905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.178411 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc 
kubenswrapper[4720]: E0121 14:30:04.178712 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.678700784 +0000 UTC m=+42.587440716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.182454 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.279507 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.279929 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.779896252 +0000 UTC m=+42.688636184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.279979 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.280393 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.780385595 +0000 UTC m=+42.689125527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.338099 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.366303 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" event={"ID":"22087b1b-3ded-441f-8349-fb8f38809460","Type":"ContainerStarted","Data":"a4002e248d870a6134914a3582a580238184afda7f20d29a8c18d9b22c08bbd4"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.394105 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"] Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.395301 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.395716 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.89569633 +0000 UTC m=+42.804436262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.415998 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.469915 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" event={"ID":"61f96497-68d8-4347-b831-f7bc0204c677","Type":"ContainerStarted","Data":"1c66c0276821602c5edbec1cb152195002103444752fa96b7a64e6c92b173f44"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.495216 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"0a4b393749e8bb158083a687b0e28dc40131fb09d24ec98c35de9bfd981c5f0f"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.499289 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.499632 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.999612264 +0000 UTC m=+42.908352196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: W0121 14:30:04.523682 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e00143_8d6c_45fb_aa6c_44015c27a3f1.slice/crio-d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe WatchSource:0}: Error finding container d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe: Status 404 returned error can't find the container with id d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.537282 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" podStartSLOduration=22.537260359 podStartE2EDuration="22.537260359s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.535302534 +0000 UTC m=+42.444042466" watchObservedRunningTime="2026-01-21 14:30:04.537260359 +0000 UTC m=+42.446000291" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.550149 4720 csr.go:261] certificate signing request csr-f6lkn is approved, waiting to be issued Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.583854 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"3a60f3401a971822a6c3c5e1c76e45f5b43f1881ba3b0903071ff46ed3267b0f"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.597044 4720 csr.go:257] certificate signing request csr-f6lkn is issued Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.601199 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.601343 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.101318636 +0000 UTC m=+43.010058568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.601372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.601699 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.101687456 +0000 UTC m=+43.010427388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.617095 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" event={"ID":"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6","Type":"ContainerStarted","Data":"bf2039fd70760d288df96f1522264b3b31e14233f6babd676e9244f9ca4f09f1"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.631502 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" event={"ID":"29766114-9e0b-4064-8010-8f426935f834","Type":"ContainerStarted","Data":"dfefb22453d85c58abe579cd329abb20490bd945e049784272f03cd2584dfb08"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.647207 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" event={"ID":"ac33402e-edb9-41ab-bb76-b17108b5ea0d","Type":"ContainerStarted","Data":"46d5892660319f84ea1e8cd800ced58253dd8f8caf952b347dc69db7674d69c5"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.648374 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.666580 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.702420 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.702867 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.202845013 +0000 UTC m=+43.111584945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.703074 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.711554 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.211538966 +0000 UTC m=+43.120278898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.718818 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"67f9df6a7e20f615514512c6c244bd455467fbc7e5a228f1cf92ec6c3864a1a5"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.732073 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" event={"ID":"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb","Type":"ContainerStarted","Data":"3d70726aeba065ed4519327507bc91405caf8af69dc543f70496090a17ba27cf"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.754511 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" podStartSLOduration=22.754481861 podStartE2EDuration="22.754481861s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.697247996 +0000 UTC m=+42.605987958" watchObservedRunningTime="2026-01-21 14:30:04.754481861 +0000 UTC m=+42.663221803" Jan 21 14:30:04 crc 
kubenswrapper[4720]: I0121 14:30:04.765582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tx54b" event={"ID":"728ae7a4-9793-4555-abbb-b8a352700089","Type":"ContainerStarted","Data":"eef27d30b0469b1e3c32696913cf322af70812c4540f276b721a15ab9fb08687"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.810906 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"b911f58a7cbfdcc89c70d4e190476bc6c151db9ab249fad967fcfa77f4c4b7c8"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.814280 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.815338 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.315319637 +0000 UTC m=+43.224059579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.850693 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" podStartSLOduration=22.850676989 podStartE2EDuration="22.850676989s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.848938849 +0000 UTC m=+42.757678781" watchObservedRunningTime="2026-01-21 14:30:04.850676989 +0000 UTC m=+42.759416931" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.880791 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerStarted","Data":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882417 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882490 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882514 4720 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.925161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.926211 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.426201127 +0000 UTC m=+43.334941059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.955966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" event={"ID":"25067bcc-8503-442b-b348-87d7e1321dbd","Type":"ContainerStarted","Data":"a86027c8a1d6f97abdd1ac248e6a6795566fdbfdc287504cfc91affcaede3c41"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.969941 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" podStartSLOduration=23.969921823 podStartE2EDuration="23.969921823s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.968064461 +0000 UTC m=+42.876804413" watchObservedRunningTime="2026-01-21 14:30:04.969921823 +0000 UTC m=+42.878661775" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.970317 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" podStartSLOduration=22.970309693 podStartE2EDuration="22.970309693s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.930784055 +0000 UTC m=+42.839523997" watchObservedRunningTime="2026-01-21 14:30:04.970309693 +0000 UTC m=+42.879049625" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004309 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004360 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" 
event={"ID":"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092","Type":"ContainerStarted","Data":"e226ecb176783d4e83f6dbdf832c07112a9be1248aca29842d32bb46e309cafe"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004505 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.008951 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" podStartSLOduration=23.008932107 podStartE2EDuration="23.008932107s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.004706698 +0000 UTC m=+42.913446650" watchObservedRunningTime="2026-01-21 14:30:05.008932107 +0000 UTC m=+42.917672049" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.027488 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.028200 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.528183526 +0000 UTC m=+43.436923458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.036433 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerStarted","Data":"2887016d2bf00c16b4c00748b47e857869446bf465448174bb20e4e6131f872f"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.037293 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.044002 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"6c1742221c6fc2ea79651daddda07dafe086f710ffb076a28589d9c2168776a2"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.044040 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.047265 4720 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles" containerID="cri-o://a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" gracePeriod=30 Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.047538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerStarted","Data":"20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.048881 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.048948 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.063494 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.081148 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podStartSLOduration=24.081125531 podStartE2EDuration="24.081125531s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.078261791 +0000 UTC m=+42.987001733" watchObservedRunningTime="2026-01-21 14:30:05.081125531 +0000 UTC m=+42.989865463" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.081786 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podStartSLOduration=23.08178082 podStartE2EDuration="23.08178082s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.041330325 +0000 UTC m=+42.950070257" watchObservedRunningTime="2026-01-21 14:30:05.08178082 +0000 UTC m=+42.990520752" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.128519 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:05 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:05 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:05 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.128572 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:05 crc 
kubenswrapper[4720]: I0121 14:30:05.132722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.139798 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.639780866 +0000 UTC m=+43.548520798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.234134 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.234454 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.734439131 +0000 UTC m=+43.643179063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.271379 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.329126 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" podStartSLOduration=24.329108415 podStartE2EDuration="24.329108415s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.187239097 +0000 UTC m=+43.095979029" watchObservedRunningTime="2026-01-21 14:30:05.329108415 +0000 UTC m=+43.237848347" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.336367 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.336753 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.836738949 +0000 UTC m=+43.745478881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.437459 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.437861 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.937835364 +0000 UTC m=+43.846575296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.441857 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.442788 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.446012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.539044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.539482 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.039464095 +0000 UTC m=+43.948204027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.600402 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-21 14:25:04 +0000 UTC, rotation deadline is 2026-10-20 09:57:01.438317231 +0000 UTC Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.600645 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6523h26m55.837674681s for next certificate rotation Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.634966 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.636079 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.637890 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640555 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640587 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.641299 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.641385 4720 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.141367592 +0000 UTC m=+44.050107534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.641607 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.661991 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.679956 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.725909 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741418 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741533 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" 
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.742016 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.242005504 +0000 UTC m=+44.150745436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842048 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842685 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842744 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842804 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.843411 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.843487 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.34347082 +0000 UTC m=+44.252210752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.843742 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.857608 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.858583 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.869430 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.876240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.897812 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947628 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947689 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" 
(UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.948167 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.448148456 +0000 UTC m=+44.356888458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.961401 4720 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-86dvk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.961478 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podUID="25067bcc-8503-442b-b348-87d7e1321dbd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048549 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048825 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048857 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048905 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.049280 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.54923966 +0000 UTC m=+44.457979592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.049869 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.049987 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.050172 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.050881 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.094844 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" event={"ID":"22087b1b-3ded-441f-8349-fb8f38809460","Type":"ContainerStarted","Data":"c8eceb60572f6162c0c955f8aad2dcef86faf6e1f36706504931972ad6124fd0"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.096794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"968913338eb8518c6dbbe73e98e64885086a293721a8f30f0b13f8c4d3aba2de"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.097788 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerStarted","Data":"651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.117124 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.119444 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerStarted","Data":"316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 
14:30:06.119911 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.128870 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vh6" event={"ID":"c3f6d778-ef18-4ad7-bd13-fb7e5983af23","Type":"ContainerStarted","Data":"fecb40ecb9afe912623a7175e16d918f02eabe55a49dcc1e0427eedc76b4af1b"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.133189 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:06 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:06 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:06 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.133255 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.151596 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" podStartSLOduration=24.15157865 podStartE2EDuration="24.15157865s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.15155972 +0000 UTC m=+44.060299672" watchObservedRunningTime="2026-01-21 14:30:06.15157865 +0000 UTC m=+44.060318582" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152083 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152100 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"49b6cad3737585c15ad5079706732ff2c45073b70a5e605429f16aca057087d7"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152170 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152196 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152343 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.152705 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.652692782 +0000 UTC m=+44.561432714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.176448 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"c768102736fae3e06a7547588d93ac80d31bd9677eec427f354190b508c4fbb2"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.176982 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.190674 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.197119 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" event={"ID":"fd1cfb10-4405-4ab9-8631-690622069d01","Type":"ContainerStarted","Data":"4e8987d42ba4c5529c8b73b93b8fc3182fe8607ef6e1da8e5fd63d9195c2fd46"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.204257 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"d997dc3690941e9c3ebc8356d68a0504b34573122bc3142c2aecfa544b68ad28"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.211557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerStarted","Data":"67b6377e3711859f3193aa79202ee17c2f2c10ee776aabe85bf9c31512f966ad"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.214898 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.229667 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"54aaf0431c1fc000e77bbdd850b6650c74c7fd8a9e5e80188424d5938f4ecefe"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.234011 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" event={"ID":"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c","Type":"ContainerStarted","Data":"98c98fbadf7f3e7f292782d1cddb7912face0c2f029cfd169f3551f42aaf30d1"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.235184 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29483415-hwxpp_d390eca3-a064-441f-b469-3111e626bcae/collect-profiles/0.log" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.235276 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.243981 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podStartSLOduration=11.243963111 podStartE2EDuration="11.243963111s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.24287589 +0000 UTC m=+44.151615822" watchObservedRunningTime="2026-01-21 14:30:06.243963111 +0000 UTC m=+44.152703043" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.244306 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.249544 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" event={"ID":"fccce0ee-16e1-4237-8081-a6a3c93c5851","Type":"ContainerStarted","Data":"6953a2573da38f7bc8140ee278e90e6b12d78a56e39a5f7fde5cb7802248bc99"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253161 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253234 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.253283 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.753257901 +0000 UTC m=+44.661997833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253306 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253364 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.255233 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume" (OuterVolumeSpecName: "config-volume") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.255440 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.255792 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.755778542 +0000 UTC m=+44.664518474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.256854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257344 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257399 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257457 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257481 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296863 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29483415-hwxpp_d390eca3-a064-441f-b469-3111e626bcae/collect-profiles/0.log" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296904 4720 generic.go:334] "Generic (PLEG): container finished" podID="d390eca3-a064-441f-b469-3111e626bcae" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" exitCode=2 Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296967 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerDied","Data":"20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb"} Jan 21 14:30:06 crc 
kubenswrapper[4720]: I0121 14:30:06.296992 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerDied","Data":"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.297017 4720 scope.go:117] "RemoveContainer" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.297109 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.299519 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.313066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"c6cff6b05c13f3c17654a3bfac26f39af51d795e43510a703262697f21a80cac"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.324286 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.328778 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"e000f9d0c7d509713368b4e850e9619d534255a1c2acd7e5f742ea3f09048459"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.330907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.336613 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc" (OuterVolumeSpecName: "kube-api-access-4dqdc") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "kube-api-access-4dqdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.356298 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" event={"ID":"a6e00143-8d6c-45fb-aa6c-44015c27a3f1","Type":"ContainerStarted","Data":"d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361560 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361921 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361935 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.362341 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.86232179 +0000 UTC m=+44.771061732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.371839 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.371901 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.373254 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" podStartSLOduration=24.373243866 podStartE2EDuration="24.373243866s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.285373982 +0000 UTC m=+44.194113924" watchObservedRunningTime="2026-01-21 14:30:06.373243866 +0000 UTC m=+44.281983798" Jan 21 
14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.373835 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.470196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.476900 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.976879452 +0000 UTC m=+44.885619384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.486920 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.545691 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" podStartSLOduration=24.545673561 podStartE2EDuration="24.545673561s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.545147747 +0000 UTC m=+44.453887699" watchObservedRunningTime="2026-01-21 14:30:06.545673561 +0000 UTC m=+44.454413493" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.547704 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" podStartSLOduration=24.547695449 podStartE2EDuration="24.547695449s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.469083494 +0000 UTC m=+44.377823446" watchObservedRunningTime="2026-01-21 14:30:06.547695449 +0000 UTC m=+44.456435401" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.579236 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.579392 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 14:30:07.079375566 +0000 UTC m=+44.988115498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.579475 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.580116 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.080109738 +0000 UTC m=+44.988849670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.584161 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" podStartSLOduration=24.584146791 podStartE2EDuration="24.584146791s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.582904676 +0000 UTC m=+44.491644608" watchObservedRunningTime="2026-01-21 14:30:06.584146791 +0000 UTC m=+44.492886723" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.657352 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" podStartSLOduration=24.657335453 podStartE2EDuration="24.657335453s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.655081069 +0000 UTC m=+44.563821031" watchObservedRunningTime="2026-01-21 14:30:06.657335453 +0000 UTC m=+44.566075395" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.681397 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.681758 4720 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.181738987 +0000 UTC m=+45.090478929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.718901 4720 scope.go:117] "RemoveContainer" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.723912 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": container with ID starting with a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409 not found: ID does not exist" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.723960 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409"} err="failed to get container status \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": rpc error: code = NotFound desc = could not find container \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": container with ID starting with a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409 not found: ID does not exist" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.735981 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" podStartSLOduration=24.735963748 podStartE2EDuration="24.735963748s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.732647145 +0000 UTC m=+44.641387087" watchObservedRunningTime="2026-01-21 14:30:06.735963748 +0000 UTC m=+44.644703680" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.782522 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.785969 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.28594213 +0000 UTC m=+45.194682062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.832503 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.847010 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.885211 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.885602 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.385583154 +0000 UTC m=+45.294323086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.988741 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.989312 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.489299712 +0000 UTC m=+45.398039644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.089817 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.090466 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.590446649 +0000 UTC m=+45.499186581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.108468 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.123329 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.123601 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.131212 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:07 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:07 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:07 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.131273 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.172323 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.172384 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176281 4720 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176339 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176364 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176450 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.191208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.193125 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.693110658 +0000 UTC m=+45.601850590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.193515 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.193548 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.226862 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.295306 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.296048 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.796027574 +0000 UTC m=+45.704767506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.296098 4720 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-v2pht container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.296130 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podUID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.326816 4720 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-v2pht container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.326885 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podUID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.382384 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.407684 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.407965 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.907952683 +0000 UTC m=+45.816692615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.434668 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" event={"ID":"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c","Type":"ContainerStarted","Data":"519f47995dd784bb4cf274ea6fe463ec4e8006ef7435bade22b62727a5868bc1"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.469251 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" event={"ID":"fccce0ee-16e1-4237-8081-a6a3c93c5851","Type":"ContainerStarted","Data":"67810c6149f6a6e59bc9eb73bbd632c8fe0e8a6d31d812492283062b75fbb016"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.470230 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.478820 4720 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5gg5l container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.478887 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" podUID="fccce0ee-16e1-4237-8081-a6a3c93c5851" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.498914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerStarted","Data":"24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.508344 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.509512 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.009492931 +0000 UTC m=+45.918232873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.543312 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" event={"ID":"a6e00143-8d6c-45fb-aa6c-44015c27a3f1","Type":"ContainerStarted","Data":"0434a9b612c70d163cdcd2c8e0ae7449dc94c05a57dde70fbc11296020e73ce4"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.556910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"a5faac90817005d0ec9a2561734998e41366a786d383a006ee579e39b73b4941"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.577387 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerStarted","Data":"d3b5cdbc839bad4c3029ff33f78cd38f5b5e460e9963f6c280d92ade619bd510"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.579300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" event={"ID":"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092","Type":"ContainerStarted","Data":"8e689c54f4aa279c50cb943b25e6e08fb2eab35cf67ed899457c81521c71aeb4"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.609473 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.610376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.110363669 +0000 UTC m=+46.019103601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.621164 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" podStartSLOduration=25.621139461 podStartE2EDuration="25.621139461s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.619094434 +0000 UTC m=+45.527834376" watchObservedRunningTime="2026-01-21 14:30:07.621139461 +0000 UTC m=+45.529879393" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.624279 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"e88152461de95b2d3879730fa006e0e5249df7579f5958336bdfc79b7ce596e2"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.626546 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"2a1760c0b0a1875048875fd14c37d6dbdd8f6c426ab4471ed1f5205716974b83"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.717128 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.718823 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.21879777 +0000 UTC m=+46.127537712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.734642 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"de1edc092793e7f1ba423fcf05a0d8e01201368f76f99b2ad5779465e9d8d83e"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.744083 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"bfe30717a15d15ceb38ed46686648690145c7fb299f2f73f2c74a6bff0ccaf08"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.744794 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-njjgs" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.758431 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"df67309d4e16017900e2964193f307e14888f80c9c30ab9c6f7e58639fc93acf"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.762608 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"03c5d5d841f49c909c3e704a31faedc725965bd1e58ee925978fc2f5004161ab"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.770345 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"5eefcb917305426fe1b4c817f0e44feef59c159d7b3860fd5af7b252d9ed7f26"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.782965 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vh6" event={"ID":"c3f6d778-ef18-4ad7-bd13-fb7e5983af23","Type":"ContainerStarted","Data":"99f28ed1d7f47d2c232a8199808cc278dd0935fb030dd791cb25673d70d9cd6a"} Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.819290 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.819677 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.876707 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" 
podStartSLOduration=25.876682248 podStartE2EDuration="25.876682248s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.833308211 +0000 UTC m=+45.742048173" watchObservedRunningTime="2026-01-21 14:30:07.876682248 +0000 UTC m=+45.785422190" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.878493 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.885182 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.893232 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.898566 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.398549201 +0000 UTC m=+46.307289123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.986849 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" podStartSLOduration=25.986830087 podStartE2EDuration="25.986830087s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.985326714 +0000 UTC m=+45.894066666" watchObservedRunningTime="2026-01-21 14:30:07.986830087 +0000 UTC m=+45.895570029" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.995223 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.996079 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.496048175 +0000 UTC m=+46.404788107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.028631 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.028884 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.028899 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.029015 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.029960 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.031898 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.063607 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.097585 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102514 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102576 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102598 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102670 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: 
\"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.103001 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.602986274 +0000 UTC m=+46.511726206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.120756 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.131181 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:08 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:08 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:08 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.131550 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.175425 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-njjgs" podStartSLOduration=13.175409965 podStartE2EDuration="13.175409965s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.132143751 +0000 UTC m=+46.040883693" watchObservedRunningTime="2026-01-21 14:30:08.175409965 +0000 UTC m=+46.084149897" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.176527 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" podStartSLOduration=26.176522096 podStartE2EDuration="26.176522096s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.17344785 +0000 UTC m=+46.082187792" watchObservedRunningTime="2026-01-21 14:30:08.176522096 +0000 UTC m=+46.085262028" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.178959 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.210853 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.211119 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.711100976 +0000 UTC m=+46.619840908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211185 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211210 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211234 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.218569 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.218827 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.234189 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" podStartSLOduration=26.234176263 podStartE2EDuration="26.234176263s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.233102902 +0000 UTC m=+46.141842844" watchObservedRunningTime="2026-01-21 14:30:08.234176263 +0000 
UTC m=+46.142916185" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.253799 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.254995 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.275717 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" podStartSLOduration=8.275705378 podStartE2EDuration="8.275705378s" podCreationTimestamp="2026-01-21 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.274939476 +0000 UTC m=+46.183679408" watchObservedRunningTime="2026-01-21 14:30:08.275705378 +0000 UTC m=+46.184445310" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.287250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.288478 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.312354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.313860 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.813848567 +0000 UTC m=+46.722588499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.354899 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355225 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355474 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355604 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.367048 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.407201 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" podStartSLOduration=26.407183495 podStartE2EDuration="26.407183495s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.331491832 +0000 UTC m=+46.240231764" watchObservedRunningTime="2026-01-21 14:30:08.407183495 +0000 UTC m=+46.315923427" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.407755 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n9vh6" podStartSLOduration=13.40775017 podStartE2EDuration="13.40775017s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.406032262 +0000 UTC m=+46.314772204" watchObservedRunningTime="2026-01-21 14:30:08.40775017 +0000 UTC m=+46.316490102" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414259 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414508 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.414641 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.914621503 +0000 UTC m=+46.823361435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.502412 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520302 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520357 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520393 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.520850 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.020821611 +0000 UTC m=+46.929561543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520866 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.521156 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.551450 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.622133 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.622435 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.122419781 +0000 UTC m=+47.031159713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.638055 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.641473 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.675530 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.686361 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.717841 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d390eca3-a064-441f-b469-3111e626bcae" path="/var/lib/kubelet/pods/d390eca3-a064-441f-b469-3111e626bcae/volumes" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.718417 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.723536 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.723874 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.223862925 +0000 UTC m=+47.132602857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.827796 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828216 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828237 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " 
pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.829784 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.329767996 +0000 UTC m=+47.238507918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.831056 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"ea3e42f59558ac33a0401b68f0727a6c36ce0105fe3235e649f0da0ed75102ed"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.833391 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.845299 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.846410 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.854724 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" exitCode=0 Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.855499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.864270 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.872225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.884418 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" podStartSLOduration=26.884399938 podStartE2EDuration="26.884399938s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.884075578 +0000 UTC m=+46.792815520" watchObservedRunningTime="2026-01-21 14:30:08.884399938 +0000 UTC m=+46.793139870" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.893629 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"85ca11cc33d09ce2c8fd7bab9c3118f3fb41bcc9c4f1e36c585b8c6b04ce1492"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.911034 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"f20d09863be7ca61fa7342b49aa7455a47ce4ac93bdba3ee521a02fa9dc39c25"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934193 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934302 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.935195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.935401 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.935561 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.435546911 +0000 UTC m=+47.344286843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.936231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"604be98ef64db9bf2dee8b6519ee20ed2b69e3f5461179f4162ff99dea72456c"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.959324 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f" exitCode=0 Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960272 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"f92665f685bf80e17e3e48269da656cf92cd51a8b00c063a085e7a0052993aa3"} Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960388 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" gracePeriod=30 Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.972211 4720 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.988573 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.036471 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" podStartSLOduration=27.036453771 podStartE2EDuration="27.036453771s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:09.034195818 +0000 UTC m=+46.942935750" watchObservedRunningTime="2026-01-21 14:30:09.036453771 +0000 UTC m=+46.945193703" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.037709 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.037957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.038051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.038131 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.038218 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.53820665 +0000 UTC m=+47.446946582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.040116 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.090716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.133895 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:09 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:09 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:09 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.133950 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141509 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141590 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.146424 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: 
\"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.148055 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.648043071 +0000 UTC m=+47.556783003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.150928 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.188557 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x48m6" podStartSLOduration=28.188540417 podStartE2EDuration="28.188540417s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:09.185147841 +0000 UTC m=+47.093887763" watchObservedRunningTime="2026-01-21 14:30:09.188540417 +0000 UTC m=+47.097280349" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.196955 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.280584 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.281921 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.781895815 +0000 UTC m=+47.690635747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.294372 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.387027 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.387314 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.8873025 +0000 UTC m=+47.796042432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.474341 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.488173 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.488461 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.988443487 +0000 UTC m=+47.897183419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.588771 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.589314 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.089297275 +0000 UTC m=+47.998037207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.689428 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.689643 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.189617688 +0000 UTC m=+48.098357610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.690183 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.690531 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.190515903 +0000 UTC m=+48.099255845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.790679 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.790910 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.290887648 +0000 UTC m=+48.199627580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.893958 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.894472 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.394454433 +0000 UTC m=+48.303194365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.983922 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" exitCode=0 Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.984187 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.984242 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerStarted","Data":"79477e1af1d10f20ff2bf7e280a7ee476108ea069779af5f5cdc35424364da3b"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.993564 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35" exitCode=0 Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.993683 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.994868 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.995017 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.494994782 +0000 UTC m=+48.403734714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.995212 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.995534 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.495521347 +0000 UTC m=+48.404261279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.005414 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" exitCode=0 Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.005497 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.010936 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" exitCode=0 Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.011021 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.011056 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" 
event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerStarted","Data":"9c2892b80c1d95c871202545822430a42e2c2316e71ccc122df3bcadd593a956"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.017982 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"0a63c057137dd1dce00640395cbccb63187990cb6f7fcaffbda1530cf924ee49"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.098083 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.100235 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.600217723 +0000 UTC m=+48.508957655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.137938 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:10 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:10 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:10 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.138002 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.202182 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.202442 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.702431389 +0000 UTC m=+48.611171321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.303414 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.303513 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.803498183 +0000 UTC m=+48.712238115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.303803 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.304092 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.80408448 +0000 UTC m=+48.712824402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.405106 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.405298 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.905271138 +0000 UTC m=+48.814011070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.405365 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.405677 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.905666658 +0000 UTC m=+48.814406590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.479079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.506765 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.006742103 +0000 UTC m=+48.915482035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.506807 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.506898 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.507233 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.007224247 +0000 UTC m=+48.915964179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.591259 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.607922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.609679 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.109634639 +0000 UTC m=+49.018374571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: W0121 14:30:10.621397 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod328ecaa4_59eb_4707_a320_245636d0c778.slice/crio-b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d WatchSource:0}: Error finding container b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d: Status 404 returned error can't find the container with id b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.717820 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.718255 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.218231354 +0000 UTC m=+49.126971286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.838543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.838685 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.338666751 +0000 UTC m=+49.247406683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.838791 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.839089 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.339081373 +0000 UTC m=+49.247821305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.940119 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.940640 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.44062282 +0000 UTC m=+49.349362752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.033879 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475"}
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.034295 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"26891b408ccd24b0c8434d044528c04f82c156ee44333c5cc05cf38ad2ef94ce"}
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.041498 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.041967 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.541945752 +0000 UTC m=+49.450685684 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.044057 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d"}
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.122990 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:11 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:11 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:11 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.123272 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.142513 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.142693 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.642666777 +0000 UTC m=+49.551406709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.142935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.143250 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.643237103 +0000 UTC m=+49.551977035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.244962 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.245126 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.745097159 +0000 UTC m=+49.653837101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.245245 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.245590 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.745581932 +0000 UTC m=+49.654321864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.345949 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.346062 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.84603704 +0000 UTC m=+49.754776962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.346353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.347193 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.847180702 +0000 UTC m=+49.755920634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.365383 4720 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.450122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.450339 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.950312154 +0000 UTC m=+49.859052086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.451396 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.451768 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.951755074 +0000 UTC m=+49.860495006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.552107 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.552263 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.052233472 +0000 UTC m=+49.960973404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.552374 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.552686 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.052670724 +0000 UTC m=+49.961410656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.655495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.655812 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.155797466 +0000 UTC m=+50.064537398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.680248 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.680950 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.690482 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.690621 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.694855 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761275 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761333 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761420 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.761710 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.261698166 +0000 UTC m=+50.170438098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864513 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864912 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.865364 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.365345043 +0000 UTC m=+50.274084985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.865402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.885100 4720 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-21T14:30:11.365409613Z","Handler":null,"Name":""}
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.896370 4720 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.896408 4720 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.920269 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.969348 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.972708 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.974341 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.979002 4720 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.979048 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.983605 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.984001 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.985860 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.031700 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.078424 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.078474 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.123029 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:12 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:12 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.123079 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180443 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180485 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180858 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.184297 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475" exitCode=0
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.184357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475"}
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.237961 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" exitCode=0
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.238073 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744"}
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.239944 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.272930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.279487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"eb9a838c233ac9fd43524b7ce216d0485f99ae1db53c573564d9447916affa15"}
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.282116 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.315236 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.342737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.506420 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.714784 4720 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pm8dm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]log ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]etcd ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/start-apiserver-admission-initializer ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/generic-apiserver-start-informers ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/max-in-flight-filter ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/storage-object-count-tracker-hook ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/project.openshift.io-projectcache ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-startinformers ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-restmapperupdater ok
Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Jan 21 14:30:12 crc kubenswrapper[4720]: livez check failed
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.714834 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" podUID="0f685084-f748-4a34-9020-4d562f2a6d45" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.715775 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.719976 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.720014 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.889530 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.024891 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.084215 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"]
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.099789 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-njjgs"
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.125973 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:13 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:13 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:13 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.126055 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.411059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"1d976a40674b8dfa980b1373ae0e6473bf7974dd711406c75edbb43b7bcd7b54"}
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.446909 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerStarted","Data":"75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce"}
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.449580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerStarted","Data":"cc78447803378e22f6cbae3e9270bdc6d0ee1630fceb9cd43ec6c839a71ce985"}
Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.450245 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerStarted","Data":"bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f"}
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.124026 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:14 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:14 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:14 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.125155 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.473325 4720 generic.go:334] "Generic (PLEG): container finished" podID="c48951e9-42eb-461f-812e-adc413405821" containerID="24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4" exitCode=0
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.473394 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerDied","Data":"24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4"}
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.545048 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-j577t" podStartSLOduration=19.545016921 podStartE2EDuration="19.545016921s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:14.542972523 +0000 UTC m=+52.451712475" watchObservedRunningTime="2026-01-21 14:30:14.545016921 +0000 UTC m=+52.453756853"
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.778311 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.814065 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.125855 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:15 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:15 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:15 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.126119 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.540942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerStarted","Data":"e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d"}
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.541087 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.546188 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerStarted","Data":"780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5"}
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.549053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerStarted","Data":"bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec"}
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.570054 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" podStartSLOduration=33.570033626 podStartE2EDuration="33.570033626s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.568155523 +0000 UTC m=+53.476895455" watchObservedRunningTime="2026-01-21 14:30:15.570033626 +0000 UTC m=+53.478773558"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.593588 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.593553185 podStartE2EDuration="1.593553185s" podCreationTimestamp="2026-01-21 14:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.583167044 +0000 UTC m=+53.491906986" watchObservedRunningTime="2026-01-21 14:30:15.593553185 +0000 UTC m=+53.502293137"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.630626 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.630611684 podStartE2EDuration="4.630611684s" podCreationTimestamp="2026-01-21 14:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.628640569 +0000 UTC m=+53.537380511" watchObservedRunningTime="2026-01-21 14:30:15.630611684 +0000 UTC m=+53.539351616"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.646208 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.646193301 podStartE2EDuration="4.646193301s" podCreationTimestamp="2026-01-21 14:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.643938918 +0000 UTC m=+53.552678870" watchObservedRunningTime="2026-01-21 14:30:15.646193301 +0000 UTC m=+53.554933233"
Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.905762 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103428 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") "
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") "
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103573 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") "
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.104542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume" (OuterVolumeSpecName: "config-volume") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.116634 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.128485 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:16 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:16 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:16 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.128562 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.130361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h" (OuterVolumeSpecName: "kube-api-access-4rl6h") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "kube-api-access-4rl6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204927 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204965 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204979 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.561633 4720 generic.go:334] "Generic (PLEG): container finished" podID="38609d5a-a946-4abf-8b84-2e90a636844a" containerID="bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec" exitCode=0
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.561708 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerDied","Data":"bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec"}
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598092 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598085 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerDied","Data":"651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846"}
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598324 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846"
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.632861 4720 generic.go:334] "Generic (PLEG): container finished" podID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerID="780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5" exitCode=0
Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.633030 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerDied","Data":"780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5"}
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.122329 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:17 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:17 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:17 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.122386 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175176 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175235 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175920 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body=
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175953 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused"
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.178083 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body=
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.178108 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused"
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.707566 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.727776 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.061780 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.121533 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:18 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:18 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:18 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.121599 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244437 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"0389f8b8-4893-4619-b91f-0f2ef883fd85\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") "
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244513 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0389f8b8-4893-4619-b91f-0f2ef883fd85" (UID: "0389f8b8-4893-4619-b91f-0f2ef883fd85"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244588 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"0389f8b8-4893-4619-b91f-0f2ef883fd85\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") "
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244996 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.253922 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0389f8b8-4893-4619-b91f-0f2ef883fd85" (UID: "0389f8b8-4893-4619-b91f-0f2ef883fd85"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.347592 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.356460 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.359304 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.551681 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"38609d5a-a946-4abf-8b84-2e90a636844a\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") "
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.551741 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"38609d5a-a946-4abf-8b84-2e90a636844a\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") "
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.553056 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "38609d5a-a946-4abf-8b84-2e90a636844a" (UID: "38609d5a-a946-4abf-8b84-2e90a636844a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.564169 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "38609d5a-a946-4abf-8b84-2e90a636844a" (UID: "38609d5a-a946-4abf-8b84-2e90a636844a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.627397 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.635757 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.638962 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.639017 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.677536 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.677570 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.736725 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.758607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerDied","Data":"bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f"}
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.758873 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807434 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807797 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerDied","Data":"75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce"}
Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807826 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce"
Jan 21 14:30:19 crc kubenswrapper[4720]: I0121 14:30:19.121300 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:19 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:19 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:19 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:19 crc kubenswrapper[4720]: I0121 14:30:19.121346 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:20 crc kubenswrapper[4720]: I0121 14:30:20.121479 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:20 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:20 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:20 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:20 crc kubenswrapper[4720]: I0121 14:30:20.121526 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:21 crc kubenswrapper[4720]: I0121 14:30:21.121084 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:21 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:21 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:21 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:21 crc kubenswrapper[4720]: I0121 14:30:21.121397 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:22 crc kubenswrapper[4720]: I0121 14:30:22.121168 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:22 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:22 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:22 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:22 crc kubenswrapper[4720]: I0121 14:30:22.121248 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:23 crc kubenswrapper[4720]: I0121 14:30:23.122082 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:23 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:23 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:23 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:23 crc kubenswrapper[4720]: I0121 14:30:23.122163 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:24 crc kubenswrapper[4720]: I0121 14:30:24.121679 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:24 crc kubenswrapper[4720]: [+]has-synced ok
Jan 21 14:30:24 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:24 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:24 crc kubenswrapper[4720]: I0121 14:30:24.121757 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:25 crc kubenswrapper[4720]: I0121 14:30:25.122899 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:30:25 crc kubenswrapper[4720]: I0121 14:30:25.128134 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.167874 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.168206 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.181244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wmxb9"
Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.599817 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: ,
stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.601337 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.604102 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.604151 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:32 crc kubenswrapper[4720]: I0121 14:30:32.511844 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.144761 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.344438 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.358070 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:38 crc kubenswrapper[4720]: I0121 14:30:38.558035 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.593163 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.594548 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.595854 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.595915 4720 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.371049 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log" Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.371985 4720 generic.go:334] "Generic (PLEG): container finished" podID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" exitCode=137 Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.372117 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerDied","Data":"316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55"} Jan 21 14:30:46 crc kubenswrapper[4720]: I0121 14:30:46.701161 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.590677 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592111 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592446 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592504 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282252 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282830 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" 
containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282848 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282869 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282875 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282887 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282894 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283005 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283026 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283038 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283441 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.286460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.286820 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.290757 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.319343 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.319321608 podStartE2EDuration="5.319321608s" podCreationTimestamp="2026-01-21 14:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:51.317268047 +0000 UTC m=+89.226007989" watchObservedRunningTime="2026-01-21 14:30:51.319321608 +0000 UTC m=+89.228061540" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.323868 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.324009 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424781 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424849 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424925 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.450505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.609994 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.271343 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.272477 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.285372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.286151 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.286614 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.293168 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.388749 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389238 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389471 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.411403 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.601754 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.819876 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.820338 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swm4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fwhvj_openshift-marketplace(d436685f-1f7d-454b-afa4-76389c5c5ff4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.821548 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.590526 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.590939 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.591263 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.591321 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:01 crc kubenswrapper[4720]: E0121 14:31:01.460178 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.545214 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.545744 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4szdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-c95rn_openshift-marketplace(8432f9d9-0168-4b49-b6a7-66281f46bd5a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.548243 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591024 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591674 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591897 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591968 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.201303 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.201484 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ql9d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jbtfr_openshift-marketplace(aa280405-236d-4a24-896d-04a2dfad8a3a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.202711 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" Jan 21 14:31:10 crc kubenswrapper[4720]: E0121 14:31:10.872749 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" Jan 21 14:31:10 crc kubenswrapper[4720]: E0121 14:31:10.873368 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.265622 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.265816 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmfz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v6vwc_openshift-marketplace(1d6131a5-b63e-42a5-905a-9ed5350a421a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.267271 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.327936 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.328136 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxcbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lt46m_openshift-marketplace(7bb4c793-0d05-43f9-a9ad-30d9b6b40595): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.329397 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.428558 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.428709 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfn9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5qbdf_openshift-marketplace(4bbb0e48-d287-42fc-a165-86038d2083c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.430119 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.690313 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.690646 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.709342 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.731787 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.731954 4720 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmzpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-x7575_openshift-marketplace(328ecaa4-59eb-4707-a320-245636d0c778): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.733978 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.763790 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.763901 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4pfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-52n8k_openshift-marketplace(306f9668-a044-448f-a14f-81c9726d3008): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.765165 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.765876 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.765929 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951555 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951575 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951801 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951910 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready" (OuterVolumeSpecName: "ready") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.952054 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.961381 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8" (OuterVolumeSpecName: "kube-api-access-2cdm8") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "kube-api-access-2cdm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052590 4720 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052637 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052646 4720 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052675 4720 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.149006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 14:31:16 crc kubenswrapper[4720]: W0121 14:31:16.160219 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3bb0d67_7131_40e1_818d_5d4fd5c1a725.slice/crio-ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a WatchSource:0}: Error finding container ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a: Status 404 returned error can't find the container with id ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.170588 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 14:31:16 crc kubenswrapper[4720]: W0121 14:31:16.186123 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d8131be_bd51_4ed7_bb5c_57990adf304a.slice/crio-f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce WatchSource:0}: Error finding container f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce: Status 404 returned error can't find the container with id f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213894 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213955 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerDied","Data":"606f33407deb43968f7cc7f66c83d922a2e45672a6ac0cad952ee6a566842321"} Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213988 4720 scope.go:117] "RemoveContainer" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.214074 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.217982 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerStarted","Data":"ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a"} Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.219586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerStarted","Data":"f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce"} Jan 21 14:31:16 crc kubenswrapper[4720]: E0121 14:31:16.221262 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778" Jan 21 14:31:16 crc kubenswrapper[4720]: E0121 14:31:16.221393 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.255644 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.259593 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.686054 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" path="/var/lib/kubelet/pods/75c0e088-7bdf-47f4-b434-b184e742d40a/volumes" Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.226557 4720 generic.go:334] "Generic (PLEG): container finished" podID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerID="2512271d020bfe9083ce97421060dd72da178b6e1eacc8d10a11852e7a71fefd" exitCode=0 Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.226604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerDied","Data":"2512271d020bfe9083ce97421060dd72da178b6e1eacc8d10a11852e7a71fefd"} Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.233557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerStarted","Data":"7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e"} Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.263406 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=21.263385149 podStartE2EDuration="21.263385149s" podCreationTimestamp="2026-01-21 14:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:31:17.259519534 +0000 UTC m=+115.168259486" watchObservedRunningTime="2026-01-21 14:31:17.263385149 +0000 UTC m=+115.172125101" Jan 
21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.241496 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739"} Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.513859 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587563 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"8d8131be-bd51-4ed7-bb5c-57990adf304a\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"8d8131be-bd51-4ed7-bb5c-57990adf304a\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587742 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d8131be-bd51-4ed7-bb5c-57990adf304a" (UID: "8d8131be-bd51-4ed7-bb5c-57990adf304a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587877 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.596472 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d8131be-bd51-4ed7-bb5c-57990adf304a" (UID: "8d8131be-bd51-4ed7-bb5c-57990adf304a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.689682 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerDied","Data":"f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce"} Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249904 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce" Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249214 4720 util.go:48] "No ready sandbox for pod can be found. 
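
The pod_startup_latency_tracker entries scattered through this stretch reward a close read: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. That is why installer-9-crc above reports both as 21.263385149s (its pull timestamps are the zero value, so nothing was pulled), while community-operators-fwhvj just below reports 1m15.69s end to end but only 4.74s against the SLO after excluding roughly 70.96s of pulling. A check of that arithmetic with the fwhvj timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        // layout matches how the tracker prints timestamps in the log
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // timestamps copied from the community-operators-fwhvj entry
        created := mustParse("2026-01-21 14:30:06 +0000 UTC")
        firstPull := mustParse("2026-01-21 14:30:08.967842637 +0000 UTC")
        lastPull := mustParse("2026-01-21 14:31:19.92570353 +0000 UTC")
        running := mustParse("2026-01-21 14:31:21.694704063 +0000 UTC")

        e2e := running.Sub(created)          // podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // image-pull window excluded
        fmt.Println(e2e)                     // 1m15.694704063s
        fmt.Println(slo)                     // 4.73684317s
    }

Both printed durations match the podStartE2EDuration and podStartSLOduration fields in the entry exactly, which confirms the subtraction reading of the two metrics.
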
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.254039 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739" exitCode=0 Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.254129 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739"} Jan 21 14:31:20 crc kubenswrapper[4720]: I0121 14:31:20.261027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567"} Jan 21 14:31:21 crc kubenswrapper[4720]: I0121 14:31:21.694725 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fwhvj" podStartSLOduration=4.73684317 podStartE2EDuration="1m15.694704063s" podCreationTimestamp="2026-01-21 14:30:06 +0000 UTC" firstStartedPulling="2026-01-21 14:30:08.967842637 +0000 UTC m=+46.876582569" lastFinishedPulling="2026-01-21 14:31:19.92570353 +0000 UTC m=+117.834443462" observedRunningTime="2026-01-21 14:31:20.283848153 +0000 UTC m=+118.192588145" watchObservedRunningTime="2026-01-21 14:31:21.694704063 +0000 UTC m=+119.603444005" Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.283507 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" exitCode=0 Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.283560 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37"} Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.292257 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" exitCode=0 Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.292309 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98"} Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.308592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerStarted","Data":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"} Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.311691 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerStarted","Data":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"} Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.330185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-c95rn" podStartSLOduration=4.092685237 podStartE2EDuration="1m19.330161719s" podCreationTimestamp="2026-01-21 14:30:07 +0000 UTC" firstStartedPulling="2026-01-21 14:30:10.014396876 +0000 UTC m=+47.923136798" lastFinishedPulling="2026-01-21 14:31:25.251873348 +0000 UTC m=+123.160613280" observedRunningTime="2026-01-21 14:31:26.326230872 +0000 UTC m=+124.234970804" watchObservedRunningTime="2026-01-21 14:31:26.330161719 +0000 UTC m=+124.238901661" Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.347290 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jbtfr" podStartSLOduration=3.058653801 podStartE2EDuration="1m18.347251269s" podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:09.986444672 +0000 UTC m=+47.895184604" lastFinishedPulling="2026-01-21 14:31:25.27504214 +0000 UTC m=+123.183782072" observedRunningTime="2026-01-21 14:31:26.344465306 +0000 UTC m=+124.253205248" watchObservedRunningTime="2026-01-21 14:31:26.347251269 +0000 UTC m=+124.255991201" Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.489253 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.489521 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.598921 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:27 crc kubenswrapper[4720]: I0121 14:31:27.354251 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.367488 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.367648 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.427550 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.638557 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.638689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.674039 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.831238 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:31:29 crc kubenswrapper[4720]: I0121 14:31:29.329078 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"} Jan 21 14:31:30 crc 
kubenswrapper[4720]: I0121 14:31:30.334616 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" containerID="cri-o://87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" gracePeriod=2 Jan 21 14:31:32 crc kubenswrapper[4720]: I0121 14:31:32.343117 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" exitCode=0 Jan 21 14:31:32 crc kubenswrapper[4720]: I0121 14:31:32.343202 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"} Jan 21 14:31:33 crc kubenswrapper[4720]: I0121 14:31:33.350353 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" exitCode=0 Jan 21 14:31:33 crc kubenswrapper[4720]: I0121 14:31:33.350398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567"} Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.453607 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486544 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486668 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.488194 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities" (OuterVolumeSpecName: "utilities") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "utilities". 
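
The grace-period kill above is the standard deletion path for a catalog pod: the kubelet asks the runtime to stop cri-o://87a9c41a… with gracePeriod=2, giving registry-server two seconds to exit on SIGTERM before escalation, and the ContainerDied that arrives a few seconds later (PLEG relists roughly once a second) carries exitCode=0, so no SIGKILL was needed. A sketch of issuing that stop directly over CRI, assuming a recent google.golang.org/grpc (NewClient) and the v1 CRI API; the kubelet's real call path adds lifecycle hooks and deadline bookkeeping:

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtime "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // CRI-O's default socket; adjust for other runtimes.
        conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
        defer cancel()

        // StopContainer with Timeout=2 is the CRI form of the log's
        // gracePeriod=2: SIGTERM now, SIGKILL after two seconds.
        _, err = runtime.NewRuntimeServiceClient(conn).StopContainer(ctx,
            &runtime.StopContainerRequest{
                ContainerId: "87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567",
                Timeout:     2,
            })
        if err != nil {
            log.Fatal(err)
        }
        log.Println("stop issued; a ContainerDied PLEG event follows once it exits")
    }
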
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.493983 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t" (OuterVolumeSpecName: "kube-api-access-swm4t") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "kube-api-access-swm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.544000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587495 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587766 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587862 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.364887 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"f92665f685bf80e17e3e48269da656cf92cd51a8b00c063a085e7a0052993aa3"} Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.365220 4720 scope.go:117] "RemoveContainer" containerID="87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.365007 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.382234 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.384859 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.395333 4720 scope.go:117] "RemoveContainer" containerID="0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.422381 4720 scope.go:117] "RemoveContainer" containerID="fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f" Jan 21 14:31:36 crc kubenswrapper[4720]: I0121 14:31:36.685045 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" path="/var/lib/kubelet/pods/d436685f-1f7d-454b-afa4-76389c5c5ff4/volumes" Jan 21 14:31:38 crc kubenswrapper[4720]: I0121 14:31:38.405308 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:31:38 crc kubenswrapper[4720]: I0121 14:31:38.676043 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.618558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.623920 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" exitCode=0 Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.624179 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.628453 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.632158 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.644690 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.760265 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x7575" podStartSLOduration=5.21038437 podStartE2EDuration="1m31.760243675s" 
podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:12.274929318 +0000 UTC m=+50.183669250" lastFinishedPulling="2026-01-21 14:31:38.824788583 +0000 UTC m=+136.733528555" observedRunningTime="2026-01-21 14:31:39.723712446 +0000 UTC m=+137.632452408" watchObservedRunningTime="2026-01-21 14:31:39.760243675 +0000 UTC m=+137.668983607" Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.630994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.631218 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" containerID="cri-o://726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" gracePeriod=2 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.691945 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d" exitCode=0 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.692192 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d"} Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.697264 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" exitCode=0 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.697319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.510544 4720 util.go:48] "No ready sandbox for pod can be found. 
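
The SyncLoop (probe) lines above show the startup-probe gate working as designed: while a registry-server's startup probe is still unhealthy, its readiness result is surfaced as the empty status "", and only once startup reports "started" does readiness flip to "ready" (c95rn and jbtfr above, v6vwc further down). A compact model of that gating:

    package main

    import "fmt"

    // probeGate models the transitions in the SyncLoop (probe) entries:
    // readiness stays "" until the startup probe reports started.
    type probeGate struct{ started bool }

    func (g *probeGate) startup(healthy bool) string {
        if healthy {
            g.started = true
            return "started"
        }
        return "unhealthy"
    }

    func (g *probeGate) readiness(healthy bool) string {
        if !g.started {
            return "" // still gated by the startup probe, as in status=""
        }
        if healthy {
            return "ready"
        }
        return "not ready"
    }

    func main() {
        var g probeGate
        fmt.Printf("startup=%q readiness=%q\n", g.startup(false), g.readiness(true))
        fmt.Printf("startup=%q readiness=%q\n", g.startup(true), g.readiness(true))
        // startup="unhealthy" readiness=""   then   startup="started" readiness="ready"
    }
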
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594648 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594734 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.595711 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities" (OuterVolumeSpecName: "utilities") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.609849 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6" (OuterVolumeSpecName: "kube-api-access-ql9d6") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "kube-api-access-ql9d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.636395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695789 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695834 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695849 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718322 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" exitCode=0 Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718387 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718413 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718469 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"79477e1af1d10f20ff2bf7e280a7ee476108ea069779af5f5cdc35424364da3b"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718487 4720 scope.go:117] "RemoveContainer" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.721066 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272" exitCode=0 Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.721120 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.728034 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerStarted","Data":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.741911 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.745798 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.777148 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-v6vwc" podStartSLOduration=6.188751924 podStartE2EDuration="1m37.777126856s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:08.862541154 +0000 UTC m=+46.771281096" lastFinishedPulling="2026-01-21 14:31:40.450916096 +0000 UTC m=+138.359656028" observedRunningTime="2026-01-21 14:31:42.776430263 +0000 UTC m=+140.685170215" watchObservedRunningTime="2026-01-21 14:31:42.777126856 +0000 UTC m=+140.685866798" Jan 21 14:31:43 crc kubenswrapper[4720]: I0121 14:31:43.107487 4720 scope.go:117] "RemoveContainer" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:44 crc kubenswrapper[4720]: I0121 14:31:44.683627 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" path="/var/lib/kubelet/pods/aa280405-236d-4a24-896d-04a2dfad8a3a/volumes" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.871080 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.871948 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.920905 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.077011 4720 scope.go:117] "RemoveContainer" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.094574 4720 scope.go:117] "RemoveContainer" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.095807 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": container with ID starting with 726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74 not found: ID does not exist" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.095905 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"} err="failed to get container status \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": rpc error: code = NotFound desc = could not find container \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": container with ID starting with 726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74 not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096006 4720 scope.go:117] "RemoveContainer" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.096463 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": container with ID starting with d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37 not found: ID does not exist" 
containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096549 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37"} err="failed to get container status \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": rpc error: code = NotFound desc = could not find container \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": container with ID starting with d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37 not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096633 4720 scope.go:117] "RemoveContainer" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.097126 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": container with ID starting with c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a not found: ID does not exist" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.097252 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a"} err="failed to get container status \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": rpc error: code = NotFound desc = could not find container \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": container with ID starting with c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.784896 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.040915 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.041252 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.081407 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.140119 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.805555 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:52 crc kubenswrapper[4720]: I0121 14:31:52.880322 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:31:52 crc kubenswrapper[4720]: I0121 14:31:52.880388 4720 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053536 4720 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053910 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053930 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053952 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053964 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053983 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053995 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054019 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054032 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054052 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054082 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054093 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054111 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054123 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054144 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054156 4720 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054338 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054358 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054373 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054387 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054905 4720 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055053 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055471 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055594 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055727 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055642 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055513 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.057496 4720 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.058834 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.058871 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.058958 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.058977 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059189 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059262 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059287 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059355 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059389 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059455 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059484 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059502 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060010 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060044 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060073 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060095 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060120 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.060359 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060382 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060702 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.095345 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252703 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252793 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252902 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253451 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253497 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253555 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253592 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.354985 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355409 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355601 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356017 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356466 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356405 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 
14:31:54.356193 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356610 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357217 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357161 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357428 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.386752 4720 util.go:30] "No sandbox for pod can be found. 
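
Everything from the SyncLoop ADD source="file" onward is a static-pod rollover driven by the installer: the manifest directory now holds kube-apiserver-startup-monitor-crc and a new kube-apiserver-crc revision, so the file source REMOVEs the old revision (each of its five containers killed with gracePeriod=15), the cpu and memory managers sweep CPUSet and memory state for pods that no longer exist, and since every volume on both new pods is a host path, each MountVolume.SetUp succeeds instantly. A reduced sketch of that stale-state sweep:

    package main

    import "fmt"

    // removeStaleState drops resource-manager assignments for pods that
    // are no longer active, echoing the cpu_manager/memory_manager
    // entries. assignments maps podUID -> container names with state.
    func removeStaleState(assignments map[string][]string, active map[string]bool) {
        for podUID, containers := range assignments {
            if active[podUID] {
                continue
            }
            for _, name := range containers {
                fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
            }
            delete(assignments, podUID) // "Deleted CPUSet assignment"
        }
    }

    func main() {
        assignments := map[string][]string{
            // the departed kube-apiserver revision's containers
            "f4b27818a5e8e43d0dc095d08835c792": {"setup", "kube-apiserver", "kube-apiserver-cert-syncer"},
            // the replacement revision stays
            "71bb4a3aecc4ba5b26c4b7318770ce13": {"kube-apiserver"},
        }
        active := map[string]bool{"71bb4a3aecc4ba5b26c4b7318770ce13": true}
        removeStaleState(assignments, active)
    }
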
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.797706 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.799816 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800345 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800710 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800921 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.800949 4720 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.801118 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.804877 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.807057 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808505 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" exitCode=0
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808543 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" exitCode=0
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808552 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2" exitCode=0
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808563 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f" exitCode=2
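The five "Failed to update lease" errors above are followed by a fallback to ensure-lease and a retry whose interval doubles on each failure: 200ms here, then 400ms, 800ms, 1.6s, 3.2s and 6.4s further down the log. Below is a minimal Go sketch of that doubling-retry pattern; the function names, the cap, and the give-up condition are illustrative assumptions, not the kubelet's actual implementation.

```go
// Sketch of the doubling retry interval visible in this log
// (200ms -> 400ms -> 800ms -> 1.6s -> 3.2s -> 6.4s).
// All names here are illustrative; this is not kubelet source.
package main

import (
	"errors"
	"fmt"
	"time"
)

func ensureLeaseWithBackoff(ensure func() error, base, max time.Duration) {
	interval := base
	for {
		if err := ensure(); err == nil {
			return
		} else {
			fmt.Printf("Failed to ensure lease exists, will retry err=%v interval=%v\n", err, interval)
		}
		time.Sleep(interval)
		if interval < max {
			interval *= 2 // double after each failure, as the intervals above show
		}
	}
}

func main() {
	attempts := 0
	ensureLeaseWithBackoff(func() error {
		attempts++
		if attempts < 6 { // hypothetical: API server comes back on the 6th try
			return errors.New("connect: connection refused")
		}
		return nil
	}, 200*time.Millisecond, 7*time.Second)
}
```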
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808605 4720 scope.go:117] "RemoveContainer" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.810875 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerID="7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e" exitCode=0
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.810911 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerDied","Data":"7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e"}
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.811650 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.812282 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.002626 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms"
Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.403318 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms"
Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.576917 4720 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-52n8k.188cc5822f10d1b9 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-52n8k,UID:306f9668-a044-448f-a14f-81c9726d3008,APIVersion:v1,ResourceVersion:28349,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 13.321s (13.321s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,LastTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 14:31:56 crc kubenswrapper[4720]: W0121 14:31:56.659362 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644 WatchSource:0}: Error finding container 5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644: Status 404 returned error can't find the container with id 5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644 Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.831421 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.850338 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644"} Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.875770 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3"} Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.882773 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.882959 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.883120 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.913169 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"} Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.914367 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.914725 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.915580 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.915880 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: E0121 14:31:57.204475 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.262967 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.263918 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.264347 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.264782 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.265065 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297226 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297282 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297308 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297590 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297811 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock" (OuterVolumeSpecName: "var-lock") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.303087 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.383404 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.384183 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.384755 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385053 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385202 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385351 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385486 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398360 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
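The MountVolume entries earlier in this log and the UnmountVolume/TearDown entries here come from the kubelet's volume reconciler, which diffs a desired state of the world against an actual state and emits mount or unmount operations for the difference. A minimal sketch of that diff follows; the function and variable names and the example state are hypothetical, not kubelet source.

```go
// Illustrative sketch (not kubelet code) of the reconciler pattern in these
// entries: volumes present in the actual state but absent from the desired
// state get an unmount operation; volumes desired but not yet mounted get a
// mount operation.
package main

import "fmt"

func reconcile(desired, actual map[string]bool) {
	for vol := range actual {
		if !desired[vol] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
		}
	}
	for vol := range desired {
		if !actual[vol] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
		}
	}
}

func main() {
	// Hypothetical state for a pod being torn down, like f4b27818... above:
	// nothing is desired any more, three host-path volumes are still mounted.
	desired := map[string]bool{}
	actual := map[string]bool{"resource-dir": true, "audit-dir": true, "cert-dir": true}
	reconcile(desired, actual)
}
```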
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398376 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398465 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398502 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398524 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398594 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398604 4720 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398613 4720 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398621 4720 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398629 4720 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398637 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.921687 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae"}
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 
14:31:57.922594 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922843 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922946 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerDied","Data":"ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a"} Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922965 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922973 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.923111 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.923646 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.924377 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.924754 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.925352 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926794 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d" exitCode=0 Jan 21 
14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926864 4720 scope.go:117] "RemoveContainer" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926926 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.928689 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"} Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.929563 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.929881 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930100 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930300 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930515 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.931041 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.946334 4720 scope.go:117] "RemoveContainer" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.950322 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951366 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951622 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951924 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.952247 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.952494 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.957436 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958038 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958184 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958356 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958537 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958774 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.964097 4720 scope.go:117] "RemoveContainer" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.980863 4720 scope.go:117] "RemoveContainer" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f" Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.998924 4720 scope.go:117] "RemoveContainer" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d" Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.017165 4720 scope.go:117] "RemoveContainer" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b" Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053212 4720 scope.go:117] "RemoveContainer" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.053826 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": container with ID starting with 4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1 not found: ID does not exist" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053864 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"} err="failed to get container status \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": rpc error: code = NotFound desc = could not find container \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": container with ID starting with 4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1 not found: ID does not exist" Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053891 4720 scope.go:117] "RemoveContainer" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054257 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": container with ID starting with 07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179 not found: ID does not exist" 
containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054282 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"} err="failed to get container status \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": rpc error: code = NotFound desc = could not find container \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": container with ID starting with 07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179 not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054299 4720 scope.go:117] "RemoveContainer" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054635 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": container with ID starting with 8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2 not found: ID does not exist" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054683 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"} err="failed to get container status \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": rpc error: code = NotFound desc = could not find container \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": container with ID starting with 8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2 not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054703 4720 scope.go:117] "RemoveContainer" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054961 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": container with ID starting with c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f not found: ID does not exist" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054990 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"} err="failed to get container status \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": rpc error: code = NotFound desc = could not find container \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": container with ID starting with c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055007 4720 scope.go:117] "RemoveContainer" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.055372 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": container with ID starting with 696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d not found: ID does not exist" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"
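Each "RemoveContainer" in this stretch is answered by a NotFound from the runtime and a "DeleteContainer returned error" entry. Since the container is already gone, the removal goal is met, so the kubelet only logs the error and moves on. A stdlib-only Go sketch of that "NotFound means done" handling follows; the sentinel error and function names are invented for illustration, not taken from kubelet or CRI-O.

```go
// Sketch (assumed, not kubelet source) of why the NotFound errors above are
// benign: when removing a container, a "container does not exist" reply from
// the runtime means removal is already complete.
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI's "code = NotFound" status.
var errNotFound = errors.New("rpc error: code = NotFound desc = could not find container")

// removeContainer treats "already gone" as success rather than retrying.
func removeContainer(id string, rpc func(string) error) error {
	if err := rpc(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("container %s not found, treating as already removed\n", id)
			return nil
		}
		return err
	}
	return nil
}

func main() {
	gone := func(id string) error { return fmt.Errorf("%w %q", errNotFound, id) }
	if err := removeContainer("4a142cc64cbb", gone); err != nil {
		fmt.Println("unexpected:", err)
	}
}
```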
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055396 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"} err="failed to get container status \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": rpc error: code = NotFound desc = could not find container \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": container with ID starting with 696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055412 4720 scope.go:117] "RemoveContainer" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.055706 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": container with ID starting with 96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b not found: ID does not exist" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055728 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"} err="failed to get container status \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": rpc error: code = NotFound desc = could not find container \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": container with ID starting with 96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.683982 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.806112 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="3.2s"
Jan 21 14:31:59 crc kubenswrapper[4720]: I0121 14:31:59.296529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:31:59 crc kubenswrapper[4720]: I0121 14:31:59.296855 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:31:59 crc kubenswrapper[4720]: E0121 14:31:59.885036 4720 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-52n8k.188cc5822f10d1b9 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-52n8k,UID:306f9668-a044-448f-a14f-81c9726d3008,APIVersion:v1,ResourceVersion:28349,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 13.321s (13.321s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,LastTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
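The event just above is the same Pulled event that failed to post at 14:31:56 (note the identical name redhat-operators-52n8k.188cc5822f10d1b9): "may retry after sleeping" means the recorder sleeps and re-posts while the API server refuses connections, and silently drops the event if its retries run out. A rough Go sketch of that behavior, with invented names and retry counts that are not client-go's actual API:

```go
// Minimal sketch of "Unable to write event (may retry after sleeping)":
// the same event is re-posted after a sleep; giving up only loses the
// event, never pod lifecycle state. Illustrative, not kubelet source.
package main

import (
	"errors"
	"fmt"
	"time"
)

func recordEvent(post func() error, maxTries int, sleep time.Duration) {
	for try := 1; try <= maxTries; try++ {
		if err := post(); err != nil {
			fmt.Printf("Unable to write event (may retry after sleeping): %v\n", err)
			time.Sleep(sleep)
			continue
		}
		return
	}
	// Retries exhausted: the event is dropped.
}

func main() {
	recordEvent(func() error {
		return errors.New("connect: connection refused")
	}, 3, 10*time.Millisecond)
}
```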
Jan 21 14:32:00 crc kubenswrapper[4720]: I0121 14:32:00.333411 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" probeResult="failure" output=<
Jan 21 14:32:00 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s
Jan 21 14:32:00 crc kubenswrapper[4720]: >
Jan 21 14:32:02 crc kubenswrapper[4720]: E0121 14:32:02.007451 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="6.4s"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.683463 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684154 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684553 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684838 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.685259 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.215757 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5qbdf"
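The startup probe failure for redhat-operators-52n8k above times out dialing :50051 within 1s, and until a startup probe succeeds the pod's readiness stays empty or unhealthy, as the SyncLoop (probe) entries here show. Below is a Go sketch of such a connect-with-timeout probe; the loopback address and the exact dial semantics are assumptions for illustration, not the prober's real implementation.

```go
// Sketch of the failing startup probe: the registry-server container
// exposes gRPC on :50051 and the probe dials it with a 1s timeout.
package main

import (
	"fmt"
	"net"
	"time"
)

func probeOnce(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within %v", addr, timeout)
	}
	return conn.Close()
}

func main() {
	// Hypothetical target; in the log the probe runs against the container.
	if err := probeOnce("127.0.0.1:50051", time.Second); err != nil {
		fmt.Println("Probe failed:", err) // the pod stays unready until this succeeds
	}
}
```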
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.216096 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.259126 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.259908 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260436 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260761 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260976 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.261205 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.332289 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.332390 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.375583 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376098 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 
14:32:06.376309 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376445 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376580 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376759 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.677826 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.678543 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.679707 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.692930 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.693482 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.693948 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.709872 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.709902 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:06 crc kubenswrapper[4720]: E0121 14:32:06.710230 4720 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.710554 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.162132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bc5237c8b9ecf69a56904a38086b1b556b9da46c038ef02ceb834c19e501708"} Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.208257 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.208752 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209172 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209644 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209922 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.210141 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212053 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212544 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212882 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213113 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213317 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213576 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.888334 4720 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.888413 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169219 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169261 4720 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579" exitCode=1 Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169304 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579"} Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169710 4720 scope.go:117] "RemoveContainer" containerID="d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170384 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170553 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170733 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.171754 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172118 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172467 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172541 4720 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c299bdc80035bd628c26aad417556c217ec2e0c882354e8f0927d781424a2196" exitCode=0 Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173120 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173138 4720 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c299bdc80035bd628c26aad417556c217ec2e0c882354e8f0927d781424a2196"} Jan 21 14:32:08 crc kubenswrapper[4720]: E0121 14:32:08.173524 4720 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173942 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174089 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174238 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174578 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174842 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.175025 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:08 crc kubenswrapper[4720]: E0121 14:32:08.408788 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="7s" Jan 21 
14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.662553 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.183720 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.184066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9be7bdfb6abd673de4e55113d4c6827bef3eaf0bc98e1aafa71d0bf69cfb4526"} Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.184817 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.185235 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.185741 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186033 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186293 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186592 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.188910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9ddb8424839ea7398efb0ec5d1f5b0d9464d767a236fcee8bcdabe25128cb3de"} Jan 21 14:32:09 crc 
kubenswrapper[4720]: I0121 14:32:09.341904 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.375775 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197763 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fea1e70658cfd5de4e665e701f39c9e28f5c14cdf428402c4f50ab922342a7d5"} Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197807 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a823adc71da48a678430092ff973edc582150c83e38d1b2d2f0309cb8dab87a"} Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197817 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77de9f2a568da0068f1fc67a9d4b39b635594a170420f55df3e0125d9cf9b995"} Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197826 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf448f72444848b0e0fbd30a6237824707794428462d9c911c90f0d33ac8da61"} Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.198134 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.198148 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.711668 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.711741 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.716180 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]log ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]etcd ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/generic-apiserver-start-informers ok Jan 21 14:32:11 crc kubenswrapper[4720]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/priority-and-fairness-filter ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiextensions-informers ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiextensions-controllers ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/crd-informer-synced ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-system-namespaces-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 21 14:32:11 crc kubenswrapper[4720]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 21 14:32:11 crc kubenswrapper[4720]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/bootstrap-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-aggregator-informers ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-registration-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-discovery-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]autoregister-completion ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-openapi-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 21 14:32:11 crc kubenswrapper[4720]: livez check failed Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.716220 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:32:12 crc kubenswrapper[4720]: I0121 14:32:12.746157 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:12 crc kubenswrapper[4720]: I0121 14:32:12.751668 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:13 crc kubenswrapper[4720]: I0121 14:32:13.211170 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:14 crc kubenswrapper[4720]: I0121 14:32:14.197346 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" containerID="cri-o://54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" gracePeriod=15 Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.221264 4720 generic.go:334] "Generic (PLEG): container finished" podID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerID="54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" exitCode=0 Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.221365 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerDied","Data":"54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8"} Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.756641 4720 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.758092 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837463 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837524 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837589 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837668 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837738 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837764 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837780 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837821 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837836 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837850 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837866 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.838343 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod 
"45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.838966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839027 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839248 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839608 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.855706 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.856155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2" (OuterVolumeSpecName: "kube-api-access-p8bf2") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "kube-api-access-p8bf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.858932 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.859133 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.860859 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.861101 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867097 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867623 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867801 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939681 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939713 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939731 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939742 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939753 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939762 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939770 4720 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939780 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939789 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939798 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939806 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939816 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") on node \"crc\" 
DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939825 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939834 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.005792 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235413 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235447 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235586 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.241896 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerDied","Data":"30cad834f566f85c0f3a6de4d149c40b4e51c114cf6d66d633ef1b6be4e13903"} Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.241960 4720 scope.go:117] "RemoveContainer" containerID="54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.242131 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.257875 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.455909 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.686324 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.824707 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:17 crc kubenswrapper[4720]: I0121 14:32:17.245605 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:17 crc kubenswrapper[4720]: 
I0121 14:32:17.245702 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:17 crc kubenswrapper[4720]: I0121 14:32:17.249190 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:22 crc kubenswrapper[4720]: I0121 14:32:22.880119 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:32:22 crc kubenswrapper[4720]: I0121 14:32:22.880727 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:32:25 crc kubenswrapper[4720]: I0121 14:32:25.863528 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.568751 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.815832 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.853556 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.065639 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.087728 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.102212 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.894760 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.899158 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.000108 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.122114 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.131389 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 
14:32:28.144171 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.211341 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.424133 4720 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.470368 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.565133 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.617614 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.747560 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.788819 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.791558 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.133513 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.183327 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.267452 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.358767 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.496447 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.519834 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.523994 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.549345 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.624432 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.760815 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.764417 4720 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.838187 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.917551 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.959684 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.043920 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.251810 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.290044 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.582001 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.642419 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.674947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.694244 4720 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.785899 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.830793 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.853178 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.981021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.981347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.991857 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.995018 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.087127 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 14:32:31 crc 
kubenswrapper[4720]: I0121 14:32:31.209693 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.224213 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.271222 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.275161 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.305851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.398019 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.405091 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.433682 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.470874 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.560927 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.778408 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.794909 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.860632 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.888741 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.967457 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.034539 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.114014 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.257393 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.328459 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" 
Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.375403 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.454571 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.473061 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.505120 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.607305 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.637329 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.645150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.677003 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.697312 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.697413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.699019 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.699132 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.721083 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.723059 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.818366 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.960915 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.990140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.001677 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.213035 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: 
I0121 14:32:33.234160 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.242825 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.273820 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.297436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.319602 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.345094 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.396812 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.400420 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.456639 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.541022 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.542378 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.566133 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.590249 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.603289 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.868493 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.941221 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.986486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.986491 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.087384 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.140894 4720 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.179291 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.188835 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.211834 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.265944 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.416851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.449060 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.605935 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.612615 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.617549 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.620105 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.627254 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.695588 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.695779 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.846153 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.879115 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.886352 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.895145 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.048624 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: 
I0121 14:32:35.065870 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.082336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.209413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.249436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.288777 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.321043 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.380727 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.414880 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.438004 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.457884 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.495536 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.559150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.563636 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.573910 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.592515 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.597952 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.615251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.709801 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.745648 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 14:32:35 crc 
kubenswrapper[4720]: I0121 14:32:35.847759 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.852543 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.871218 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.880184 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.000145 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.013591 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.067183 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.206631 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.256695 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.303588 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.364500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.448914 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.600499 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.604322 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.675579 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.738319 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.763506 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.817598 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.894265 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 14:32:36 crc 
kubenswrapper[4720]: I0121 14:32:36.936694 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.972125 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.994571 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.096920 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.099542 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.155886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.192177 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.224987 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.296079 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.322643 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.435816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.510235 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.568631 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.576878 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.632323 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.639289 4720 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.705072 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.774215 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.930770 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.024370 4720 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.043207 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.157383 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.236021 4720 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.289898 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.401218 4720 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.450868 4720 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.451865 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-52n8k" podStartSLOduration=44.920246146 podStartE2EDuration="2m30.451845084s" podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:11.044201955 +0000 UTC m=+48.952941897" lastFinishedPulling="2026-01-21 14:31:56.575800893 +0000 UTC m=+154.484540835" observedRunningTime="2026-01-21 14:32:15.78413687 +0000 UTC m=+173.692876802" watchObservedRunningTime="2026-01-21 14:32:38.451845084 +0000 UTC m=+196.360585016" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.452224 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lt46m" podStartSLOduration=57.370809771 podStartE2EDuration="2m33.452218115s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:09.995841065 +0000 UTC m=+47.904580997" lastFinishedPulling="2026-01-21 14:31:46.077249409 +0000 UTC m=+143.985989341" observedRunningTime="2026-01-21 14:32:15.887787625 +0000 UTC m=+173.796527587" watchObservedRunningTime="2026-01-21 14:32:38.452218115 +0000 UTC m=+196.360958047" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.453125 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5qbdf" podStartSLOduration=46.828159718 podStartE2EDuration="2m33.453118985s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:10.009336254 +0000 UTC m=+47.918076186" lastFinishedPulling="2026-01-21 14:31:56.634295501 +0000 UTC m=+154.543035453" observedRunningTime="2026-01-21 14:32:15.849381305 +0000 UTC m=+173.758121237" watchObservedRunningTime="2026-01-21 14:32:38.453118985 +0000 UTC m=+196.361858927" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.454057 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.454052075 podStartE2EDuration="44.454052075s" podCreationTimestamp="2026-01-21 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:32:15.905414693 
+0000 UTC m=+173.814154625" watchObservedRunningTime="2026-01-21 14:32:38.454052075 +0000 UTC m=+196.362792007" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.455783 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8","openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.455911 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7865b47677-vf9fw","openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:32:38 crc kubenswrapper[4720]: E0121 14:32:38.456180 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456279 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer" Jan 21 14:32:38 crc kubenswrapper[4720]: E0121 14:32:38.456375 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456445 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456298 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456691 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456611 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456854 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.457639 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.461226 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.461573 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462337 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462400 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462561 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462780 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.465557 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.465875 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.466851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.466949 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.467521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.467723 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.468066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.479849 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.481529 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.485612 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.494338 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.494309304 podStartE2EDuration="23.494309304s" podCreationTimestamp="2026-01-21 14:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 14:32:38.486183232 +0000 UTC m=+196.394923214" watchObservedRunningTime="2026-01-21 14:32:38.494309304 +0000 UTC m=+196.403049276" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.494603 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.589307 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.589944 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622863 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622882 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622906 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622941 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 
14:32:38.623002 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623122 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623174 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623203 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623220 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623241 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: 
\"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.646409 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.688335 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" path="/var/lib/kubelet/pods/45b6b4eb-147f-485e-96e1-5b08ee85ee9f/volumes" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724365 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724445 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724499 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724534 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: 
\"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724569 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724639 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724847 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724886 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724928 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.725091 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " 
pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.726491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.729476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.730741 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731027 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731303 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731381 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731493 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.732132 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: 
I0121 14:32:38.734948 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.735350 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.735837 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.736021 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.738521 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.749630 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.784514 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.797910 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.820640 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.886488 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.926809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.046624 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.056113 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.058623 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.161262 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.162085 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.194087 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7865b47677-vf9fw"] Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.333288 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.367204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" event={"ID":"5aad29d4-274a-49b7-8b0b-8c4c496206fc","Type":"ContainerStarted","Data":"87bd5a3fbb62e9dc1c9aaf1dcf20a5689d97d9dc8eff61c3a66f8ec13a46cce8"} Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.459749 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.490391 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.573823 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.596234 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.674748 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.774975 4720 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.780998 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.914298 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.962765 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.156200 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.168085 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.221208 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.226265 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.356886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.375229 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" event={"ID":"5aad29d4-274a-49b7-8b0b-8c4c496206fc","Type":"ContainerStarted","Data":"b6c02fa1e53e3273b5182983a84ca77bb1f0c78af6ab01fd4f05e2d46bba0d8b"} Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.375832 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.384277 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.400775 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" podStartSLOduration=51.400752335 podStartE2EDuration="51.400752335s" podCreationTimestamp="2026-01-21 14:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:32:40.399497104 +0000 UTC m=+198.308237076" watchObservedRunningTime="2026-01-21 14:32:40.400752335 +0000 UTC m=+198.309492297" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.481486 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.540865 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.641235 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.743154 4720 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.753738 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.827802 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.932770 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.973328 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.264336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.290645 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.715137 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.719516 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.797330 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.813865 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.902994 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.995562 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.094519 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.095861 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.536848 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.662726 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 14:32:43 crc kubenswrapper[4720]: I0121 14:32:43.146139 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 14:32:48 crc kubenswrapper[4720]: I0121 14:32:48.600099 4720 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 14:32:48 crc kubenswrapper[4720]: I0121 14:32:48.600607 4720 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd" gracePeriod=5 Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.879616 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880251 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880299 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880870 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880960 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" gracePeriod=600 Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.446984 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" exitCode=0 Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.447058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"} Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.447244 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"} Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.163903 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.164419 4720 util.go:48] "No ready sandbox for pod can be found. 
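The machine-config-daemon sequence above is the standard kubelet liveness-probe restart path: the HTTP probe against 127.0.0.1:8798/health is refused, the probe is reported unhealthy, and the container is killed with its termination grace period and restarted in place. A minimal Go sketch of what such a probe spec looks like in the Kubernetes API; the host, port, and path come from the log output, the period and threshold values are assumptions, and this is illustrative rather than the actual machine-config-daemon manifest:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
    	// Liveness probe equivalent to what the log shows being enforced.
    	probe := &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{ // named "Handler" in k8s.io/api before v0.23
    			HTTPGet: &corev1.HTTPGetAction{
    				Host: "127.0.0.1",
    				Path: "/health",
    				Port: intstr.FromInt(8798),
    			},
    		},
    		PeriodSeconds:    10, // assumed; not visible in the log
    		FailureThreshold: 3,  // assumed; not visible in the log
    	}
    	fmt.Printf("liveness probe: %+v\n", probe.HTTPGet)
    }

Note that the kill at 14:32:52.880960 uses gracePeriod=600, i.e. the pod's own terminationGracePeriodSeconds, whereas the startup-monitor removal above used gracePeriod=5 from its static-pod spec.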
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224752 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224853 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224893 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224914 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224908 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224944 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224988 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225060 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225158 4720 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225479 4720 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225508 4720 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.236250 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.326883 4720 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.326918 4720 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456157 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456213 4720 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd" exitCode=137
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456266 4720 scope.go:117] "RemoveContainer" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456337 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.476152 4720 scope.go:117] "RemoveContainer" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: E0121 14:32:54.476663 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": container with ID starting with f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd not found: ID does not exist" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.476735 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"} err="failed to get container status \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": rpc error: code = NotFound desc = could not find container \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": container with ID starting with f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd not found: ID does not exist"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.686986 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.687904 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.699033 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.699072 4720 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5fecc3c3-217b-4d49-a894-931812b93b05"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.701547 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.701582 4720 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5fecc3c3-217b-4d49-a894-931812b93b05"
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.506305 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"]
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.507001 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" containerID="cri-o://566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" gracePeriod=30
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.603994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"]
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.604218 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" containerID="cri-o://aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" gracePeriod=30
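The E-level "ContainerStatus from runtime service failed ... NotFound" pair above is a benign race rather than a real failure: two RemoveContainer passes target the same container, the first one wins, and the second finds it already gone, so the CRI lookup returns gRPC NotFound and deletion is effectively idempotent. A sketch of the usual Go pattern for tolerating that status code; ignoreNotFound is a hypothetical helper for illustration, not a kubelet API:

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // ignoreNotFound treats a CRI NotFound status as success: the container
    // is already removed, so there is nothing left for the caller to do.
    func ignoreNotFound(err error) error {
    	if err == nil || status.Code(err) == codes.NotFound {
    		return nil
    	}
    	return err
    }

    func main() {
    	err := status.Error(codes.NotFound, `could not find container "f212..."`)
    	fmt.Println(ignoreNotFound(err) == nil) // true: the race is harmless
    }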
period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" containerID="cri-o://aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" gracePeriod=30 Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.936412 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007688 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007744 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007770 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007837 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008610 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008633 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config" (OuterVolumeSpecName: "config") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008683 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.013406 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9" (OuterVolumeSpecName: "kube-api-access-jj8k9") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "kube-api-access-jj8k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.013805 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109485 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109520 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109536 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109546 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109560 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.437109 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514811 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514903 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514976 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.516000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config" (OuterVolumeSpecName: "config") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.516018 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca" (OuterVolumeSpecName: "client-ca") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.519962 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.520300 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q" (OuterVolumeSpecName: "kube-api-access-6tf9q") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "kube-api-access-6tf9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540175 4720 generic.go:334] "Generic (PLEG): container finished" podID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" exitCode=0 Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540238 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerDied","Data":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540242 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540267 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerDied","Data":"1a03a4355bd12eae90e463960102d7b8d0f28a5a014b426c9235206feb008d3a"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540282 4720 scope.go:117] "RemoveContainer" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544230 4720 generic.go:334] "Generic (PLEG): container finished" podID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" exitCode=0 Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544278 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerDied","Data":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544283 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerDied","Data":"830f00cd4952a252732ae85fe73bd3c43f95902077b3e9a257094be91b79359d"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.557879 4720 scope.go:117] "RemoveContainer" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: E0121 14:33:12.559014 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": container with ID starting with 566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071 not found: ID does not exist" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.559050 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} err="failed to get container status \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": rpc error: code = NotFound desc = could not find container \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": container with ID starting with 566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071 not found: ID does not exist" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.559095 4720 scope.go:117] "RemoveContainer" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.575317 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.579977 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.587123 4720 scope.go:117] "RemoveContainer" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: E0121 14:33:12.587586 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": container with ID starting with aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4 not found: ID does not exist" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.587673 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} err="failed to get container status \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": rpc error: code = NotFound desc = could not find container \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": container with ID starting with aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4 not found: ID does not exist" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.589053 
4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.591705 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616821 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616856 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616870 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616885 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.688716 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" path="/var/lib/kubelet/pods/03eab9ba-e390-43a8-ab91-b8f0fe8678a0/volumes" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.689277 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" path="/var/lib/kubelet/pods/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8/volumes" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.170799 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171052 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171082 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171096 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171102 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171112 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171118 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171224 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 
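The cpu_manager and memory_manager "RemoveStaleState" lines above show the kubelet's resource managers dropping in-memory CPU and memory assignments for the three pods that just terminated, keyed by pod UID and container name, before admitting the replacement pods. A minimal sketch of that cleanup pattern, assuming state is a map keyed by (podUID, container); the types here are illustrative, not the kubelet's:

    package main

    import "fmt"

    type assignmentKey struct{ podUID, container string }

    type staleStateCleaner struct {
    	assignments map[assignmentKey]string // stand-in for a CPUSet / memory block
    }

    // removeStaleState drops assignments for any pod no longer in the
    // active set, mirroring the log's "Deleted CPUSet assignment" entries.
    func (c *staleStateCleaner) removeStaleState(activePods map[string]bool) {
    	for key := range c.assignments {
    		if !activePods[key.podUID] {
    			fmt.Printf("removing stale assignment for pod %s container %s\n",
    				key.podUID, key.container)
    			delete(c.assignments, key)
    		}
    	}
    }

    func main() {
    	c := &staleStateCleaner{assignments: map[assignmentKey]string{
    		{"03eab9ba-e390-43a8-ab91-b8f0fe8678a0", "controller-manager"}: "cpus 0-3",
    	}}
    	c.removeStaleState(map[string]bool{}) // no active pods: the entry is dropped
    }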
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171239 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.173385 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.180720 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181064 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181294 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181677 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.182040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.186694 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"]
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.187612 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191365 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191703 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.194886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.195672 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.200395 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.204172 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"]
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.208045 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.232225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"]
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328786 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328865 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328889 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329025 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329116 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329199 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329226 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329267 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430270 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430327 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs"
pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430554 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432253 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432643 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432918 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.432968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config kube-api-access-gchcn serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" podUID="4113d218-6a4e-419e-88a3-9f6f22a8dbd5" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.433190 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.436198 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.436942 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.438681 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.454687 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.459552 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.460024 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9h4nj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" podUID="cc7f2a3c-bc80-48fe-b417-01789a08fc5f" Jan 21 14:33:13 crc kubenswrapper[4720]: 
I0121 14:33:13.473357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.550225 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.550225 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.558759 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.564446 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.733933 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734025 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734050 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734075 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734103 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734149 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
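The E-level "Error syncing pod, skipping ... context canceled" entries above come from the replacement pods being deleted by the API server while their volumes were still being mounted: the sync context is canceled mid-setup, the not-yet-mounted volumes are reported in unmounted volumes=[...], and the reconciler immediately turns around and unmounts what did get set up. A small sketch of that cancellation pattern; syncPod here is a hypothetical stand-in, not the kubelet function:

    package main

    import (
    	"context"
    	"fmt"
    	"time"
    )

    // syncPod mounts volumes one by one and aborts as soon as the pod's
    // sync context is canceled, reporting what was left unmounted.
    func syncPod(ctx context.Context, volumes []string) error {
    	for i, v := range volumes {
    		select {
    		case <-ctx.Done():
    			return fmt.Errorf("unmounted volumes=%v: %w", volumes[i:], ctx.Err())
    		case <-time.After(10 * time.Millisecond): // stand-in for one mount
    			fmt.Println("mounted", v)
    		}
    	}
    	return nil
    }

    func main() {
    	ctx, cancel := context.WithCancel(context.Background())
    	go func() { time.Sleep(15 * time.Millisecond); cancel() }() // pod deleted mid-sync
    	fmt.Println(syncPod(ctx, []string{"config", "serving-cert", "kube-api-access"}))
    }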
(UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734183 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734206 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734345 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734792 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config" (OuterVolumeSpecName: "config") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734796 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config" (OuterVolumeSpecName: "config") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.735034 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.737449 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn" (OuterVolumeSpecName: "kube-api-access-gchcn") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "kube-api-access-gchcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.738194 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.738754 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.739755 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj" (OuterVolumeSpecName: "kube-api-access-9h4nj") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "kube-api-access-9h4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835686 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835724 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835736 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835750 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835761 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835775 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835785 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835797 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 
14:33:13.835807 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.554995 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.555010 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.599870 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.609817 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.677125 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.685553 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4113d218-6a4e-419e-88a3-9f6f22a8dbd5" path="/var/lib/kubelet/pods/4113d218-6a4e-419e-88a3-9f6f22a8dbd5/volumes" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.686904 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.170239 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.171133 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174511 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174827 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174978 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.175859 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.175907 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.177074 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.177548 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.178571 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.184353 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.184494 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186505 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186725 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186889 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.187098 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.198845 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.202982 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.203728 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251402 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251878 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.252027 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " 
pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.252122 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.353598 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354387 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354687 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354810 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354959 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc 
kubenswrapper[4720]: I0121 14:33:15.355056 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355189 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355483 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.356355 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.357683 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.378204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456457 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: 
\"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456586 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.457377 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.458030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.459583 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.473591 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.496584 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.510060 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.666977 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.667429 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" containerID="cri-o://a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" gracePeriod=2 Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.712562 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.753255 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:15 crc kubenswrapper[4720]: W0121 14:33:15.754307 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d75fb57_7a86_4641_8f13_4cbcae180901.slice/crio-a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b WatchSource:0}: Error finding container a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b: Status 404 returned error can't find the container with id a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.332767 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333354 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333742 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333789 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.570838 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerStarted","Data":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.570891 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerStarted","Data":"a827d68c41cf6bca1d1353db6d4c691cd0bbcd9fa7fef0db59ccff42a67e61f8"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.571063 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574198 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" exitCode=0 Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574278 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574312 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.575765 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerStarted","Data":"e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.575790 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerStarted","Data":"a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b"} Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.577303 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.583075 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.588541 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.603422 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" podStartSLOduration=3.60340771 podStartE2EDuration="3.60340771s" podCreationTimestamp="2026-01-21 
14:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:16.600095106 +0000 UTC m=+234.508835048" watchObservedRunningTime="2026-01-21 14:33:16.60340771 +0000 UTC m=+234.512147632" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.608147 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.636630 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" podStartSLOduration=3.636409758 podStartE2EDuration="3.636409758s" podCreationTimestamp="2026-01-21 14:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:16.631340932 +0000 UTC m=+234.540080864" watchObservedRunningTime="2026-01-21 14:33:16.636409758 +0000 UTC m=+234.545149710" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.684454 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7f2a3c-bc80-48fe-b417-01789a08fc5f" path="/var/lib/kubelet/pods/cc7f2a3c-bc80-48fe-b417-01789a08fc5f/volumes" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772777 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772844 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772907 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.773793 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities" (OuterVolumeSpecName: "utilities") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.791080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp" (OuterVolumeSpecName: "kube-api-access-kxcbp") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "kube-api-access-kxcbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.864885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877508 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877534 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877544 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.581559 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.626286 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.643933 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.261881 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.262177 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" containerID="cri-o://9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3" gracePeriod=2 Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.587123 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3" exitCode=0 Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.588082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3"} Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.630224 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.684151 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" path="/var/lib/kubelet/pods/7bb4c793-0d05-43f9-a9ad-30d9b6b40595/volumes" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802561 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.803487 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities" (OuterVolumeSpecName: "utilities") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.807121 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg" (OuterVolumeSpecName: "kube-api-access-t4pfg") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "kube-api-access-t4pfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.905221 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.905257 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.913468 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.007221 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.596810 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"26891b408ccd24b0c8434d044528c04f82c156ee44333c5cc05cf38ad2ef94ce"} Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.597260 4720 scope.go:117] "RemoveContainer" containerID="9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3" Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.596888 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.633093 4720 scope.go:117] "RemoveContainer" containerID="359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272" Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.635461 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.644074 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.660960 4720 scope.go:117] "RemoveContainer" containerID="23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475" Jan 21 14:33:20 crc kubenswrapper[4720]: I0121 14:33:20.720056 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306f9668-a044-448f-a14f-81c9726d3008" path="/var/lib/kubelet/pods/306f9668-a044-448f-a14f-81c9726d3008/volumes" Jan 21 14:33:31 crc kubenswrapper[4720]: I0121 14:33:31.535165 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:31 crc kubenswrapper[4720]: I0121 14:33:31.535812 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager" containerID="cri-o://e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2" gracePeriod=30 Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.673188 4720 generic.go:334] "Generic (PLEG): container finished" podID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerID="e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2" exitCode=0 Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.673492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerDied","Data":"e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2"} Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.976931 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983031 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983120 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983214 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983266 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983335 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983879 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.984301 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config" (OuterVolumeSpecName: "config") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.984558 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.992537 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj" (OuterVolumeSpecName: "kube-api-access-pghmj") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "kube-api-access-pghmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.992846 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010096 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"] Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010380 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010396 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010409 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-utilities" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010417 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-utilities" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010426 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010433 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010440 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-content" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010448 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-content" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010458 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-utilities" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010466 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-utilities" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010477 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-content" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010484 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-content" Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010496 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010502 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010647 4720 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010684 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010699 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.011162 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.037243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"] Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084022 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084118 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084168 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084289 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084305 4720 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084316 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084326 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084335 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185279 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185328 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.186722 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.187127 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.188132 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.199144 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.204073 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.339007 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.540968 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"] Jan 21 14:33:33 crc kubenswrapper[4720]: W0121 14:33:33.547129 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99b212d_c1f0_4082_a4a4_8e4b657183a9.slice/crio-56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747 WatchSource:0}: Error finding container 56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747: Status 404 returned error can't find the container with id 56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747 Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerDied","Data":"a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b"} Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681503 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681525 4720 scope.go:117] "RemoveContainer" containerID="e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2" Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.682308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" event={"ID":"b99b212d-c1f0-4082-a4a4-8e4b657183a9","Type":"ContainerStarted","Data":"56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747"} Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.723016 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.728498 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.685132 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" path="/var/lib/kubelet/pods/7d75fb57-7a86-4641-8f13-4cbcae180901/volumes" Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.687333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" event={"ID":"b99b212d-c1f0-4082-a4a4-8e4b657183a9","Type":"ContainerStarted","Data":"b9f898336d897d4aae76d485dc53f15efa3bbf534209d9b999afa589783f4e53"} Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.687535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.693317 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.713044 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" podStartSLOduration=3.713020817 podStartE2EDuration="3.713020817s" podCreationTimestamp="2026-01-21 14:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:34.710259618 +0000 UTC m=+252.618999550" watchObservedRunningTime="2026-01-21 14:33:34.713020817 +0000 UTC m=+252.621760769" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.652195 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.652976 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" containerID="cri-o://c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" gracePeriod=30 Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.664026 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.664453 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" 
containerName="registry-server" containerID="cri-o://37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" gracePeriod=30 Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.682853 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.687929 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" containerID="cri-o://d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" gracePeriod=30 Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.698231 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.698747 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" containerID="cri-o://eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" gracePeriod=30 Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.708958 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.709609 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" containerID="cri-o://0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" gracePeriod=30 Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.715435 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.717950 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.723046 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"] Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800243 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800325 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.900940 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.901010 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.901033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.904505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.924480 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.926546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.043041 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.306603 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.315945 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.316030 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.316055 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.317091 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities" (OuterVolumeSpecName: "utilities") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.329500 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4" (OuterVolumeSpecName: "kube-api-access-dmfz4") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "kube-api-access-dmfz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.417400 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.417431 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.426598 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.445835 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.460133 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.473947 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518477 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518537 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518556 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518584 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518677 4720 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518735 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518756 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518776 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518973 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520268 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities" (OuterVolumeSpecName: "utilities") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520742 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities" (OuterVolumeSpecName: "utilities") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities" (OuterVolumeSpecName: "utilities") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.525739 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc" (OuterVolumeSpecName: "kube-api-access-mmzpc") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "kube-api-access-mmzpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.530143 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg" (OuterVolumeSpecName: "kube-api-access-4szdg") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "kube-api-access-4szdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.530503 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s" (OuterVolumeSpecName: "kube-api-access-sfn9s") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "kube-api-access-sfn9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.533921 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.550206 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621430 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621484 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621511 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621970 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621989 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622001 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") on node \"crc\" 
DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622009 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622018 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622026 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622035 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622451 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.624853 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7" (OuterVolumeSpecName: "kube-api-access-q7dq7") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "kube-api-access-q7dq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.624883 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.632917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.655008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.665642 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723253 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723298 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723309 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723318 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723327 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739130 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"85ca11cc33d09ce2c8fd7bab9c3118f3fb41bcc9c4f1e36c585b8c6b04ce1492"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739249 4720 scope.go:117] "RemoveContainer" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739247 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742341 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742372 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742747 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742928 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746441 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746493 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746518 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"9c2892b80c1d95c871202545822430a42e2c2316e71ccc122df3bcadd593a956"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746831 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.749233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" event={"ID":"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c","Type":"ContainerStarted","Data":"5ef63d867a51c1b59404172d62818837198f493d69b70fdf881163c4bba9bc7d"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750468 4720 generic.go:334] "Generic (PLEG): container finished" podID="90d203a9-910b-471c-afb5-e487b65136ac" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerDied","Data":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerDied","Data":"617f70e18e4e0f9b72a22ff92ce1fc94aae99827e9d16ba9cde606ce5a9e499c"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750887 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753072 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753130 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753130 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.754446 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"d3b5cdbc839bad4c3029ff33f78cd38f5b5e460e9963f6c280d92ade619bd510"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.769486 4720 scope.go:117] "RemoveContainer" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.780396 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.793139 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.831340 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.832268 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.841632 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.846627 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.850172 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.853089 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.853557 4720 scope.go:117] "RemoveContainer" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.856788 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.864987 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.866806 4720 scope.go:117] "RemoveContainer" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.867224 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": container with ID starting with 37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97 not found: ID does not exist" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867253 4720 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"} err="failed to get container status \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": rpc error: code = NotFound desc = could not find container \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": container with ID starting with 37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867273 4720 scope.go:117] "RemoveContainer" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.867447 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": container with ID starting with d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604 not found: ID does not exist" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867467 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} err="failed to get container status \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": rpc error: code = NotFound desc = could not find container \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": container with ID starting with d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867480 4720 scope.go:117] "RemoveContainer" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.868125 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": container with ID starting with 913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6 not found: ID does not exist" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.868146 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6"} err="failed to get container status \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": rpc error: code = NotFound desc = could not find container \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": container with ID starting with 913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.868158 4720 scope.go:117] "RemoveContainer" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.881593 4720 scope.go:117] "RemoveContainer" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.899466 4720 scope.go:117] "RemoveContainer" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 
crc kubenswrapper[4720]: I0121 14:33:42.920580 4720 scope.go:117] "RemoveContainer" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.920848 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": container with ID starting with 0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf not found: ID does not exist" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.920963 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} err="failed to get container status \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": rpc error: code = NotFound desc = could not find container \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": container with ID starting with 0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921038 4720 scope.go:117] "RemoveContainer" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.921734 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": container with ID starting with ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62 not found: ID does not exist" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921841 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"} err="failed to get container status \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": rpc error: code = NotFound desc = could not find container \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": container with ID starting with ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921908 4720 scope.go:117] "RemoveContainer" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.922194 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": container with ID starting with ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744 not found: ID does not exist" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.922271 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744"} err="failed to get container status \"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": rpc error: code = NotFound desc = could not find container 
\"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": container with ID starting with ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.922341 4720 scope.go:117] "RemoveContainer" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.943340 4720 scope.go:117] "RemoveContainer" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.956432 4720 scope.go:117] "RemoveContainer" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974145 4720 scope.go:117] "RemoveContainer" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.974641 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": container with ID starting with eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc not found: ID does not exist" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974693 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"} err="failed to get container status \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": rpc error: code = NotFound desc = could not find container \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": container with ID starting with eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974718 4720 scope.go:117] "RemoveContainer" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.975219 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": container with ID starting with 0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98 not found: ID does not exist" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975248 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98"} err="failed to get container status \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": rpc error: code = NotFound desc = could not find container \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": container with ID starting with 0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975267 4720 scope.go:117] "RemoveContainer" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.975519 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": container with ID starting with aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d not found: ID does not exist" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975597 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d"} err="failed to get container status \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": rpc error: code = NotFound desc = could not find container \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": container with ID starting with aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975685 4720 scope.go:117] "RemoveContainer" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.992837 4720 scope.go:117] "RemoveContainer" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.993273 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": container with ID starting with d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705 not found: ID does not exist" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.993315 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} err="failed to get container status \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": rpc error: code = NotFound desc = could not find container \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": container with ID starting with d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.993343 4720 scope.go:117] "RemoveContainer" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.009139 4720 scope.go:117] "RemoveContainer" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.032340 4720 scope.go:117] "RemoveContainer" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.048495 4720 scope.go:117] "RemoveContainer" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.048895 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": container with ID starting with c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23 not found: ID does not exist" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.048967 
4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} err="failed to get container status \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": rpc error: code = NotFound desc = could not find container \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": container with ID starting with c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23 not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049017 4720 scope.go:117] "RemoveContainer" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.049376 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": container with ID starting with de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb not found: ID does not exist" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049401 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb"} err="failed to get container status \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": rpc error: code = NotFound desc = could not find container \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": container with ID starting with de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049418 4720 scope.go:117] "RemoveContainer" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.049768 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": container with ID starting with 30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1 not found: ID does not exist" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049813 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1"} err="failed to get container status \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": rpc error: code = NotFound desc = could not find container \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": container with ID starting with 30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1 not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.760825 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" event={"ID":"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c","Type":"ContainerStarted","Data":"94d24540bb35c6fa830ff15b1e745d19fa1bc384917d0d58afcd9bc6efd8f3ad"} Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.761754 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:43 
crc kubenswrapper[4720]: I0121 14:33:43.763737 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.803811 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" podStartSLOduration=2.803789497 podStartE2EDuration="2.803789497s" podCreationTimestamp="2026-01-21 14:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:43.784871794 +0000 UTC m=+261.693611726" watchObservedRunningTime="2026-01-21 14:33:43.803789497 +0000 UTC m=+261.712529439" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.287607 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288053 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288068 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288083 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288106 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288114 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288125 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288135 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288152 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288162 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288173 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288181 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288194 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 
14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288201 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288216 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288224 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288235 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288243 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288254 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288262 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288271 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288279 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288318 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288326 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288340 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288348 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288467 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288480 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288492 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288502 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288519 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.289344 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.292801 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.304903 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345702 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345771 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-catalog-content\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345921 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447299 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447377 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447414 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-catalog-content\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447977 4720 operation_generator.go:637] "MountVolume.SetUp 
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.466265 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.603609 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fb4w"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.695120 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" path="/var/lib/kubelet/pods/1d6131a5-b63e-42a5-905a-9ed5350a421a/volumes"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.696199 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328ecaa4-59eb-4707-a320-245636d0c778" path="/var/lib/kubelet/pods/328ecaa4-59eb-4707-a320-245636d0c778/volumes"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.701088 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" path="/var/lib/kubelet/pods/4bbb0e48-d287-42fc-a165-86038d2083c9/volumes"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.702365 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" path="/var/lib/kubelet/pods/8432f9d9-0168-4b49-b6a7-66281f46bd5a/volumes"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.707958 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d203a9-910b-471c-afb5-e487b65136ac" path="/var/lib/kubelet/pods/90d203a9-910b-471c-afb5-e487b65136ac/volumes"
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.892634 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"]
Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.894034 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4hxc8"
Need to start a new one" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.897327 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.915410 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"] Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954009 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954151 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.017558 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:45 crc kubenswrapper[4720]: W0121 14:33:45.025252 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f47a635_f04f_4002_a264_f10be8c70e10.slice/crio-a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075 WatchSource:0}: Error finding container a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075: Status 404 returned error can't find the container with id a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075 Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.054722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055118 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055246 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc 
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8"
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055541 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8"
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.074395 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8"
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.211055 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4hxc8"
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.587715 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"]
Jan 21 14:33:45 crc kubenswrapper[4720]: W0121 14:33:45.590923 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ba467d_dfbe_493b_acf6_17b938a753b0.slice/crio-a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623 WatchSource:0}: Error finding container a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623: Status 404 returned error can't find the container with id a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.776417 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerStarted","Data":"a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075"}
Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.778102 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623"}
Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.680752 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"]
Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.682076 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7"
Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.687032 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.699068 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777395 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777535 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.793822 4720 generic.go:334] "Generic (PLEG): container finished" podID="1f47a635-f04f-4002-a264-f10be8c70e10" containerID="9c2b1c72fedf2b087f889500a8bf7249fcb8c582f96a5b18b72fb4e06dd0c998" exitCode=0 Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.793935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerDied","Data":"9c2b1c72fedf2b087f889500a8bf7249fcb8c582f96a5b18b72fb4e06dd0c998"} Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.795756 4720 generic.go:334] "Generic (PLEG): container finished" podID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerID="78463a7970d148b08439482a23b9dba952d67553710c3a6d71b9d262255a9e61" exitCode=0 Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.795805 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerDied","Data":"78463a7970d148b08439482a23b9dba952d67553710c3a6d71b9d262255a9e61"} Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878138 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878187 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: 
\"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878222 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878792 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878921 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.908419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.006558 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.280634 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.288063 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.289758 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.289840 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385700 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385821 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.436689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 14:33:47 crc kubenswrapper[4720]: W0121 14:33:47.440059 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a5b258_9d31_4031_85f0_1c8d00da3dda.slice/crio-802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420 WatchSource:0}: Error finding container 802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420: Status 404 returned error can't find the container with id 802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420 Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486898 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486927 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " 
pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.487290 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.487429 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.505148 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.607721 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.801524 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420"} Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.973384 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: W0121 14:33:47.981878 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a3c893_2903_4355_9af3_b8f981477494.slice/crio-248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3 WatchSource:0}: Error finding container 248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3: Status 404 returned error can't find the container with id 248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.807413 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb" exitCode=0 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.807671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb"} Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809311 4720 generic.go:334] "Generic (PLEG): container finished" podID="f9a3c893-2903-4355-9af3-b8f981477494" containerID="72b66af0c5d4d88f0ff56206b6cd5a927d24dfc13eb460c55f2bbe2c7c2bb175" exitCode=0 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809349 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" 
event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerDied","Data":"72b66af0c5d4d88f0ff56206b6cd5a927d24dfc13eb460c55f2bbe2c7c2bb175"} Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809439 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3"} Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.815868 4720 generic.go:334] "Generic (PLEG): container finished" podID="1f47a635-f04f-4002-a264-f10be8c70e10" containerID="bde9b617e649c39bfcf92df7b4beae07bc25e3961c9a5ee920a6300015135379" exitCode=0 Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.815958 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerDied","Data":"bde9b617e649c39bfcf92df7b4beae07bc25e3961c9a5ee920a6300015135379"} Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.819164 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163"} Jan 21 14:33:51 crc kubenswrapper[4720]: I0121 14:33:51.829627 4720 generic.go:334] "Generic (PLEG): container finished" podID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerID="7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163" exitCode=0 Jan 21 14:33:51 crc kubenswrapper[4720]: I0121 14:33:51.829707 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerDied","Data":"7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.836986 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.839727 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerStarted","Data":"4c38f4240a11f753005e29df2dcef41e087ace821f7289909af6860f0f7ff948"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.842273 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.848919 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"458c08bd42934ef6f7bfe4dffa6066112a7cc0d9e15db613da2b5ef519eca59a"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.913772 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4hxc8" podStartSLOduration=3.50444645 podStartE2EDuration="8.913755524s" podCreationTimestamp="2026-01-21 14:33:44 +0000 UTC" 
firstStartedPulling="2026-01-21 14:33:46.79716586 +0000 UTC m=+264.705905832" lastFinishedPulling="2026-01-21 14:33:52.206474964 +0000 UTC m=+270.115214906" observedRunningTime="2026-01-21 14:33:52.88989287 +0000 UTC m=+270.798632802" watchObservedRunningTime="2026-01-21 14:33:52.913755524 +0000 UTC m=+270.822495466" Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.856808 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124" exitCode=0 Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.856899 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124"} Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.860593 4720 generic.go:334] "Generic (PLEG): container finished" podID="f9a3c893-2903-4355-9af3-b8f981477494" containerID="3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42" exitCode=0 Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.860640 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerDied","Data":"3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42"} Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.907875 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7fb4w" podStartSLOduration=5.491724824 podStartE2EDuration="9.907856339s" podCreationTimestamp="2026-01-21 14:33:44 +0000 UTC" firstStartedPulling="2026-01-21 14:33:46.796412579 +0000 UTC m=+264.705152551" lastFinishedPulling="2026-01-21 14:33:51.212544134 +0000 UTC m=+269.121284066" observedRunningTime="2026-01-21 14:33:52.934838048 +0000 UTC m=+270.843577990" watchObservedRunningTime="2026-01-21 14:33:53.907856339 +0000 UTC m=+271.816596271" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.604533 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.605121 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.638961 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:55 crc kubenswrapper[4720]: I0121 14:33:55.211401 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:55 crc kubenswrapper[4720]: I0121 14:33:55.211521 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.251030 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4hxc8" podUID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:56 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:56 crc kubenswrapper[4720]: > Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.880389 4720 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa"} Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.882358 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"803babe81199b98919d30ecc8f9d07b7ebe605b7beb05ae97585d5105fad8b7f"} Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.942530 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kb2c7" podStartSLOduration=3.928977957 podStartE2EDuration="10.942507093s" podCreationTimestamp="2026-01-21 14:33:46 +0000 UTC" firstStartedPulling="2026-01-21 14:33:48.823671277 +0000 UTC m=+266.732411199" lastFinishedPulling="2026-01-21 14:33:55.837200403 +0000 UTC m=+273.745940335" observedRunningTime="2026-01-21 14:33:56.908044506 +0000 UTC m=+274.816784458" watchObservedRunningTime="2026-01-21 14:33:56.942507093 +0000 UTC m=+274.851247035" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.007710 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.007769 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.607923 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.607974 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:58 crc kubenswrapper[4720]: I0121 14:33:58.051122 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kb2c7" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:58 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:58 crc kubenswrapper[4720]: > Jan 21 14:33:58 crc kubenswrapper[4720]: I0121 14:33:58.643384 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bqrkw" podUID="f9a3c893-2903-4355-9af3-b8f981477494" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:58 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:58 crc kubenswrapper[4720]: > Jan 21 14:34:04 crc kubenswrapper[4720]: I0121 14:34:04.648783 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:34:04 crc kubenswrapper[4720]: I0121 14:34:04.667384 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bqrkw" podStartSLOduration=10.546034401 podStartE2EDuration="17.667364344s" podCreationTimestamp="2026-01-21 14:33:47 +0000 UTC" firstStartedPulling="2026-01-21 14:33:48.823806822 +0000 UTC m=+266.732546754" lastFinishedPulling="2026-01-21 14:33:55.945136765 +0000 UTC m=+273.853876697" observedRunningTime="2026-01-21 14:33:56.944300624 +0000 UTC m=+274.853040576" 
watchObservedRunningTime="2026-01-21 14:34:04.667364344 +0000 UTC m=+282.576104276" Jan 21 14:34:05 crc kubenswrapper[4720]: I0121 14:34:05.249458 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:34:05 crc kubenswrapper[4720]: I0121 14:34:05.295361 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:34:07 crc kubenswrapper[4720]: I0121 14:34:07.050956 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.100405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.650437 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.690958 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.517401 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.517837 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" containerID="cri-o://5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" gracePeriod=30 Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.911673 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961426 4720 generic.go:334] "Generic (PLEG): container finished" podID="840dfd09-e274-4c2b-9299-a494100e266d" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" exitCode=0 Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerDied","Data":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"} Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerDied","Data":"a827d68c41cf6bca1d1353db6d4c691cd0bbcd9fa7fef0db59ccff42a67e61f8"} Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961516 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961525 4720 scope.go:117] "RemoveContainer" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.979461 4720 scope.go:117] "RemoveContainer" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: E0121 14:34:11.980814 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": container with ID starting with 5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3 not found: ID does not exist" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.980853 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"} err="failed to get container status \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": rpc error: code = NotFound desc = could not find container \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": container with ID starting with 5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3 not found: ID does not exist" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992370 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992420 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992519 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992548 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.993426 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca" (OuterVolumeSpecName: "client-ca") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.993447 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config" (OuterVolumeSpecName: "config") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.002155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.013926 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7" (OuterVolumeSpecName: "kube-api-access-m8cm7") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "kube-api-access-m8cm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025539 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"] Jan 21 14:34:12 crc kubenswrapper[4720]: E0121 14:34:12.025768 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025779 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025870 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.026257 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.046188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"] Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102156 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102401 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102515 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102613 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.203951 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204023 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204092 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204114 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc 
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204130 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204180 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.222618 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.289170 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"]
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.293819 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"]
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305707 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305771 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305798 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305812 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
\"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305840 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305859 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305880 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.306465 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.307026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.307300 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.311119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.311159 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.322710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.324848 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.339762 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.514845 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"]
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.697156 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840dfd09-e274-4c2b-9299-a494100e266d" path="/var/lib/kubelet/pods/840dfd09-e274-4c2b-9299-a494100e266d/volumes"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970864 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" event={"ID":"bcab83dd-6fc2-4f43-b30a-831af267b19d","Type":"ContainerStarted","Data":"244af74cca6b260cfcf7f641b2db789267c3ba98bc956691a29a8cb874b361bc"}
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970936 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" event={"ID":"bcab83dd-6fc2-4f43-b30a-831af267b19d","Type":"ContainerStarted","Data":"97a3d53efb195f99cd15100ff87f06556bb4f4d3f7a4a5ad373d106e45f6e42e"}
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970989 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm"
Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.994815 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" podStartSLOduration=0.994795075 podStartE2EDuration="994.795075ms" podCreationTimestamp="2026-01-21 14:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:34:12.99112101 +0000 UTC m=+290.899860962" watchObservedRunningTime="2026-01-21 14:34:12.994795075 +0000 UTC m=+290.903535007"
Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.211235 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"]
Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.212006 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.214788 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215722 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215738 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215807 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.216414 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.219301 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.224424 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"] Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319367 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319442 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319554 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod 
\"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420425 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420458 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420506 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.421416 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.422023 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.426267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.436225 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.565474 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.966450 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"] Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.980473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" event={"ID":"724953e6-eb48-401a-b5fd-fb565448db70","Type":"ContainerStarted","Data":"bfd9635fee2a0d38ae67c9bdcfdf17fe7f7c524b1c3046f92e7ebe8fc2ae4624"} Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.003419 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" event={"ID":"724953e6-eb48-401a-b5fd-fb565448db70","Type":"ContainerStarted","Data":"c9b90dc4356cdfbe8ed5c625d916169f6fe0794c10c8acead03e0d635fe33f0a"} Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.004819 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.011607 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.021153 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" podStartSLOduration=7.021134491 podStartE2EDuration="7.021134491s" podCreationTimestamp="2026-01-21 14:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:34:18.018726532 +0000 UTC m=+295.927466494" watchObservedRunningTime="2026-01-21 14:34:18.021134491 +0000 UTC m=+295.929874423" Jan 21 14:34:22 crc kubenswrapper[4720]: I0121 14:34:22.230103 4720 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 21 14:34:32 crc kubenswrapper[4720]: I0121 14:34:32.351304 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:32 crc kubenswrapper[4720]: I0121 14:34:32.427560 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:34:57 crc kubenswrapper[4720]: I0121 14:34:57.485512 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" containerID="cri-o://e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" gracePeriod=30 Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.221522 4720 generic.go:334] "Generic (PLEG): container finished" podID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerID="e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" exitCode=0 Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.221576 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" 
event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerDied","Data":"e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d"} Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.401339 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.423959 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424018 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424190 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424270 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424340 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424700 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424730 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424787 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.427287 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: 
"ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.427507 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.432162 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.433048 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.440936 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.443509 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.447677 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd" (OuterVolumeSpecName: "kube-api-access-7jcfd") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "kube-api-access-7jcfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.450716 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526078 4720 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526115 4720 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526124 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526135 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526144 4720 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526180 4720 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526189 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228688 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerDied","Data":"cc78447803378e22f6cbae3e9270bdc6d0ee1630fceb9cd43ec6c839a71ce985"} Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228778 4720 scope.go:117] "RemoveContainer" containerID="e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228770 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.247811 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.254476 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:35:00 crc kubenswrapper[4720]: I0121 14:35:00.685155 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" path="/var/lib/kubelet/pods/ccf13312-4caa-4898-9dd3-3f9614ecee01/volumes" Jan 21 14:35:22 crc kubenswrapper[4720]: I0121 14:35:22.880226 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:35:22 crc kubenswrapper[4720]: I0121 14:35:22.880916 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:35:52 crc kubenswrapper[4720]: I0121 14:35:52.880575 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:35:52 crc kubenswrapper[4720]: I0121 14:35:52.881172 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.880219 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.880942 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881756 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881842 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" gracePeriod=600 Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682065 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" exitCode=0 Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682106 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"} Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682136 4720 scope.go:117] "RemoveContainer" containerID="926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" Jan 21 14:36:24 crc kubenswrapper[4720]: I0121 14:36:24.688027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} Jan 21 14:36:56 crc kubenswrapper[4720]: I0121 14:36:56.808922 4720 scope.go:117] "RemoveContainer" containerID="14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35" Jan 21 14:37:56 crc kubenswrapper[4720]: I0121 14:37:56.841385 4720 scope.go:117] "RemoveContainer" containerID="828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d" Jan 21 14:37:56 crc kubenswrapper[4720]: I0121 14:37:56.871676 4720 scope.go:117] "RemoveContainer" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" Jan 21 14:38:52 crc kubenswrapper[4720]: I0121 14:38:52.880601 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:38:52 crc kubenswrapper[4720]: I0121 14:38:52.881365 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:39:22 crc kubenswrapper[4720]: I0121 14:39:22.880251 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:39:22 crc kubenswrapper[4720]: I0121 14:39:22.880899 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.122553 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: E0121 14:39:34.123444 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.123462 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.123590 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.124295 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.126731 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.126938 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.127179 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-k9tzj" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.137071 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.150756 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.151528 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.154475 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hvqqj" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.179511 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.184104 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.184876 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.187891 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rr7l9" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.199682 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341715 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341767 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.362869 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.363710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.365607 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.440920 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.466109 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.500227 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.784991 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.802160 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.836246 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.864945 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: W0121 14:39:34.866289 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0236eaa4_e5d8_4699_82f8_1e9648f95dc8.slice/crio-222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d WatchSource:0}: Error finding container 222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d: Status 404 returned error can't find the container with id 222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.784048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d6jp2" event={"ID":"4eec0898-8a1a-47d9-ac37-62cfe6c7b857","Type":"ContainerStarted","Data":"91480a298275598c96f67adb1602c43e4fccf021c122b5c3fdaaf9be02d132cf"} Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.787093 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" event={"ID":"0236eaa4-e5d8-4699-82f8-1e9648f95dc8","Type":"ContainerStarted","Data":"222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d"} Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.789562 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" event={"ID":"4939bfdd-b3b4-4850-8b5d-3399548ad5a0","Type":"ContainerStarted","Data":"b430360181832cde5ba7b8ff85d38fc1cde96fbe9868850fb1d3a474e26a3a3c"} Jan 21 14:39:37 crc kubenswrapper[4720]: I0121 14:39:37.801259 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" event={"ID":"4939bfdd-b3b4-4850-8b5d-3399548ad5a0","Type":"ContainerStarted","Data":"600a6ace4d38f05f38eb69d88c28bdcc5d2daea310b75beec57f95c5e3e43dc0"} Jan 21 14:39:37 crc kubenswrapper[4720]: I0121 14:39:37.826227 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" podStartSLOduration=1.582203775 podStartE2EDuration="3.82620782s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.80189582 +0000 UTC m=+612.710635752" lastFinishedPulling="2026-01-21 14:39:37.045899865 +0000 UTC m=+614.954639797" observedRunningTime="2026-01-21 14:39:37.820087489 +0000 UTC m=+615.728827431" watchObservedRunningTime="2026-01-21 14:39:37.82620782 +0000 UTC m=+615.734947772" Jan 21 14:39:38 crc kubenswrapper[4720]: I0121 14:39:38.807305 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" event={"ID":"0236eaa4-e5d8-4699-82f8-1e9648f95dc8","Type":"ContainerStarted","Data":"f53b18ac17718378a8c351acf908be0f328901f8b9cb647748741bd3372d412a"} Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.813550 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d6jp2" event={"ID":"4eec0898-8a1a-47d9-ac37-62cfe6c7b857","Type":"ContainerStarted","Data":"245195f570219ab9a529c7c627643788975777c677bc8e0e705a3f27df79e779"} Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.814226 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.833542 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" podStartSLOduration=2.225109085 podStartE2EDuration="5.833522222s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.868917494 +0000 UTC m=+612.777657426" lastFinishedPulling="2026-01-21 14:39:38.477330631 +0000 UTC m=+616.386070563" observedRunningTime="2026-01-21 14:39:39.830124508 +0000 UTC m=+617.738864450" watchObservedRunningTime="2026-01-21 14:39:39.833522222 +0000 UTC m=+617.742262154" Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.846401 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-d6jp2" podStartSLOduration=2.154915713 podStartE2EDuration="5.846386241s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.844644848 +0000 UTC m=+612.753384770" lastFinishedPulling="2026-01-21 14:39:38.536115366 +0000 UTC m=+616.444855298" observedRunningTime="2026-01-21 14:39:39.845055524 +0000 UTC m=+617.753795456" watchObservedRunningTime="2026-01-21 14:39:39.846386241 +0000 UTC m=+617.755126173" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.474647 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478100 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" containerID="cri-o://625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" gracePeriod=30 Jan 21 14:39:43 crc 
kubenswrapper[4720]: I0121 14:39:43.478306 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" containerID="cri-o://d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478492 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478697 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" containerID="cri-o://4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478865 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" containerID="cri-o://259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.477574 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" containerID="cri-o://aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.479245 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" containerID="cri-o://cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.509414 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" containerID="cri-o://b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834132 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834177 4720 generic.go:334] "Generic (PLEG): container finished" podID="a40805c6-ef8a-4ae0-bb5b-1834d257e8c6" containerID="3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61" exitCode=2 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834224 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerDied","Data":"3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834638 4720 scope.go:117] "RemoveContainer" containerID="3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 
14:39:43.842204 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.842715 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843186 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843219 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843229 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843238 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843244 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" exitCode=143 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843250 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" exitCode=143 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843270 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843293 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843331 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843340 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} Jan 21 14:39:43 
crc kubenswrapper[4720]: I0121 14:39:43.843348 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.143601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.144408 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.144830 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199443 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2jn6"] Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199622 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199634 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199644 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199650 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199677 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199682 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199691 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199696 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199707 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199712 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199721 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199726 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199735 4720 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kubecfg-setup" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199741 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kubecfg-setup" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199749 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199755 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199765 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199771 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199865 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199878 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199888 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199897 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199906 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199914 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199922 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199930 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.201702 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220086 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220104 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220150 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220169 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220189 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220201 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220253 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220266 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220281 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220333 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220357 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220379 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220407 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220424 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220442 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220462 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") "
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220530 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220547 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220555 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220584 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220601 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log" (OuterVolumeSpecName: "node-log") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220608 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220626 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash" (OuterVolumeSpecName: "host-slash") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220647 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220689 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220887 4720 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220903 4720 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220914 4720 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220925 4720 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220936 4720 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220946 4720 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220959 4720 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220971 4720 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220983 4720 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220943 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220962 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221005 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221056 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221112 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221046 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket" (OuterVolumeSpecName: "log-socket") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.237775 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.238508 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r" (OuterVolumeSpecName: "kube-api-access-kvf2r") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "kube-api-access-kvf2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.239867 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322464 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322530 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322690 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322707 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322736 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322758 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322775 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323015 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323044 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323067 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323095 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323129 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323182 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323215 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323312 4720 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323328 4720 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323341 4720 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323353 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323364 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323375 4720 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323389 4720 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323402 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323413 4720 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323424 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323436 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") on node \"crc\" DevicePath \"\""
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424330 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424388 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424405 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424436 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424470 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424501 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424515 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424528 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424562 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424578 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424592 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424624 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424642 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424686 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424749 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424783 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425426 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425507 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425558 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425595 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425631 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425683 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425744 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425812 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425834 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425860 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425938 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.426089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.429357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.447100 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.502768 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.512227 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.849024 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.849084 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"acc666dd42119bae6cd2f607818a28cef7871b82ccc16b234457bb8f06955709"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.853357 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.853888 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854209 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" exitCode=0
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854229 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" exitCode=0
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854324 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854840 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854891 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854909 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856414 4720 generic.go:334] "Generic (PLEG): container finished" podID="60550f21-b0dd-410b-a4be-cba72e8b7b71" containerID="ffa7c67075bb066d8c8ca6301e43653894e8f39c98a52992e9346cf1d8c920e6" exitCode=0
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856433 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerDied","Data":"ffa7c67075bb066d8c8ca6301e43653894e8f39c98a52992e9346cf1d8c920e6"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856447 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"53cdfe98eeecad403bbed2b730519ce0b8a2a7a987916f6952257d761eb1338b"}
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.880858 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.911809 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"]
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.911918 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.918681 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"]
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.927238 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.945117 4720 scope.go:117] "RemoveContainer" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.963060 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.975144 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.985882 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"
Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.999025 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.011898 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.012608 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012647 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} err="failed to get container status \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012694 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.012923 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012959 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} err="failed to get container status \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": rpc error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012974 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013164 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013185 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} err="failed to get container status \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013208 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013393 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013415 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} err="failed to get container status \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013428 4720 scope.go:117] "RemoveContainer" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013608 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013629 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} err="failed to get container status \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013668 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014232 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014252 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} err="failed to get container status \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014265 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014527 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014547 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} err="failed to get container status \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014563 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014855 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014877 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} err="failed to get container status \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014891 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"
Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.015149 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.015175 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} err="failed to get container status \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.015192 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.016869 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} err="failed to get container status \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.016898 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017196 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} err="failed to get container status \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": rpc error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017221 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017450 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} err="failed to get container status \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017470 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} err="failed to get container status \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017714 4720 scope.go:117] "RemoveContainer" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017941 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} err="failed to get container status \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017963 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018229 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} err="failed to get container status \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018247 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018413 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} err="failed to get container status \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018440 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019154 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} err="failed to get container status \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019239 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019976 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} err="failed to get container status \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist"
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.881197 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"20f752df985e275fc170d7f09f0fda0fa79b977ca7e9c54386b56bf70a664352"}
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882703 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"e6df46ce549cf47027353bd50dfa269d320f9b65242e840b9bca91af4bc8bb02"}
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"d53f25e750b9c7df4a0a690ac6bb64af4d058ad69480c307e32df5881f6d78c9"}
Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882864 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6"
event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"593c6ed2601f8bcc6a2c5dff92288fba00edade13ad8613b8646f6236049e490"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"1181bddc1a3f560e3a5d183aaf6b578f0be8d4305fa36e63c13e101b915b6d86"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.883138 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"cbbfa58798f5909f5d588eedb36f15bc96de2bd901929cc01f7e493297c79976"} Jan 21 14:39:46 crc kubenswrapper[4720]: I0121 14:39:46.684831 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" path="/var/lib/kubelet/pods/ac61c15b-6fe9-4c83-9ca7-588095ab1a29/volumes" Jan 21 14:39:48 crc kubenswrapper[4720]: I0121 14:39:48.904149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"fbf18024f843adc9e3c3ccb274c6ee33723a30f7251259ff499f4e13cbd98576"} Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.879729 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880254 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880311 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880980 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.881031 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" gracePeriod=600 Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"d1e3c4978e0d34c293a9b1d777692003c1bbe558e4cd835ce33d3759a90bad05"} Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948472 4720 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948572 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.980396 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.022945 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" podStartSLOduration=9.022928146 podStartE2EDuration="9.022928146s" podCreationTimestamp="2026-01-21 14:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:39:52.978240203 +0000 UTC m=+630.886980145" watchObservedRunningTime="2026-01-21 14:39:53.022928146 +0000 UTC m=+630.931668078" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.955972 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" exitCode=0 Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957245 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957281 4720 scope.go:117] "RemoveContainer" containerID="eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957385 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.989133 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:54 crc kubenswrapper[4720]: I0121 14:39:54.965610 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} Jan 21 14:40:14 crc kubenswrapper[4720]: I0121 14:40:14.538225 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:40:16 crc kubenswrapper[4720]: I0121 14:40:16.323206 4720 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.711380 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.713141 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.715066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.729254 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.829344 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.829728 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.830016 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.931920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932605 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.963880 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.029528 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.272062 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.965923 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="18b85be22696d7a1e91179c4db1f586e24b4d0f58de335c96bb3ab80b6d2a3b1" exitCode=0 Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.966005 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"18b85be22696d7a1e91179c4db1f586e24b4d0f58de335c96bb3ab80b6d2a3b1"} Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.966241 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerStarted","Data":"7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.039443 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.042453 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.056105 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.159992 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.160073 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.160212 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261302 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261601 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261777 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.262090 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.282258 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.373161 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.781884 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978445 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" exitCode=0 Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978508 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978856 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"b483512bc397003ba094d82107302d250056bd93ed3604599c9f018730b4610a"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.981761 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="a02146b329cdb9125ea485ddeff1b2ec486f6e3ba61d778148451d89eb67ef1f" exitCode=0 Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.981792 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"a02146b329cdb9125ea485ddeff1b2ec486f6e3ba61d778148451d89eb67ef1f"} Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.994035 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="b966931718143442e396a12e6ac157054dc8452123b9bf8cc73c2cf135f05ad2" exitCode=0 Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.994098 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"b966931718143442e396a12e6ac157054dc8452123b9bf8cc73c2cf135f05ad2"} Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.996775 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.006288 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" exitCode=0 Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.007013 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" 
event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.295118 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.401903 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402085 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402642 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle" (OuterVolumeSpecName: "bundle") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.407716 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h" (OuterVolumeSpecName: "kube-api-access-5w99h") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "kube-api-access-5w99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.421273 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util" (OuterVolumeSpecName: "util") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503668 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503706 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503722 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013118 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e"} Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013171 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013131 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.014772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.037221 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppkd7" podStartSLOduration=1.488256801 podStartE2EDuration="4.037200697s" podCreationTimestamp="2026-01-21 14:40:32 +0000 UTC" firstStartedPulling="2026-01-21 14:40:32.979785067 +0000 UTC m=+670.888524999" lastFinishedPulling="2026-01-21 14:40:35.528728953 +0000 UTC m=+673.437468895" observedRunningTime="2026-01-21 14:40:36.035970353 +0000 UTC m=+673.944710315" watchObservedRunningTime="2026-01-21 14:40:36.037200697 +0000 UTC m=+673.945940639" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948425 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:39 crc kubenswrapper[4720]: E0121 14:40:39.948698 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="pull" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948714 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="pull" Jan 21 14:40:39 crc kubenswrapper[4720]: E0121 14:40:39.948734 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="util" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948742 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="util" Jan 21 14:40:39 crc 
kubenswrapper[4720]: E0121 14:40:39.948753 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948762 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948883 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.949345 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.958150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.960251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.964167 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bsqcl" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.965580 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.062157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.163736 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.184505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.300452 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.694005 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:40 crc kubenswrapper[4720]: W0121 14:40:40.702633 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdd7be0_b9cf_4501_9816_87831d74becc.slice/crio-f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e WatchSource:0}: Error finding container f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e: Status 404 returned error can't find the container with id f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e Jan 21 14:40:41 crc kubenswrapper[4720]: I0121 14:40:41.041395 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" event={"ID":"2bdd7be0-b9cf-4501-9816-87831d74becc","Type":"ContainerStarted","Data":"f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e"} Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.373537 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.373945 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.433043 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:43 crc kubenswrapper[4720]: I0121 14:40:43.118065 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:44 crc kubenswrapper[4720]: I0121 14:40:44.825165 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:45 crc kubenswrapper[4720]: I0121 14:40:45.066579 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" event={"ID":"2bdd7be0-b9cf-4501-9816-87831d74becc","Type":"ContainerStarted","Data":"4a394ea7c9dd01b0b3fdaa7b8a60225bc738f14651f025e0d795b97cfa1cda8e"} Jan 21 14:40:45 crc kubenswrapper[4720]: I0121 14:40:45.091813 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" podStartSLOduration=2.588743798 podStartE2EDuration="6.091782446s" podCreationTimestamp="2026-01-21 14:40:39 +0000 UTC" firstStartedPulling="2026-01-21 14:40:40.705236204 +0000 UTC m=+678.613976126" lastFinishedPulling="2026-01-21 14:40:44.208274842 +0000 UTC m=+682.117014774" observedRunningTime="2026-01-21 14:40:45.082354407 +0000 UTC m=+682.991094379" watchObservedRunningTime="2026-01-21 14:40:45.091782446 +0000 UTC m=+683.000522438" Jan 21 14:40:46 crc kubenswrapper[4720]: I0121 14:40:46.072827 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppkd7" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" containerID="cri-o://5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" gracePeriod=2 Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.606856 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658313 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658408 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658460 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.665259 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27" (OuterVolumeSpecName: "kube-api-access-k8v27") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "kube-api-access-k8v27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.674877 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities" (OuterVolumeSpecName: "utilities") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.761396 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.761436 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.773410 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.862747 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091634 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" exitCode=0 Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091737 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091744 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.092079 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"b483512bc397003ba094d82107302d250056bd93ed3604599c9f018730b4610a"} Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.092153 4720 scope.go:117] "RemoveContainer" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.120318 4720 scope.go:117] "RemoveContainer" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.132405 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.145336 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.155069 4720 scope.go:117] "RemoveContainer" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.175140 4720 scope.go:117] "RemoveContainer" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.175680 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": container with ID starting with 5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee not found: ID does not exist" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.175720 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} err="failed to get container status \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": rpc error: code = NotFound desc = could not find container \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": container with ID starting with 5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee not found: ID does not exist" Jan 21 14:40:48 crc 
kubenswrapper[4720]: I0121 14:40:48.175745 4720 scope.go:117] "RemoveContainer" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.176078 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": container with ID starting with b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da not found: ID does not exist" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176109 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} err="failed to get container status \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": rpc error: code = NotFound desc = could not find container \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": container with ID starting with b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da not found: ID does not exist" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176127 4720 scope.go:117] "RemoveContainer" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.176526 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": container with ID starting with e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d not found: ID does not exist" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176554 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d"} err="failed to get container status \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": rpc error: code = NotFound desc = could not find container \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": container with ID starting with e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d not found: ID does not exist" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.692180 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" path="/var/lib/kubelet/pods/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d/volumes" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.032901 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033200 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-content" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033236 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-content" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033263 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033273 4720 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033286 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-utilities" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033294 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-utilities" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033432 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.034238 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.038995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-msgbw" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.043096 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.043861 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.070464 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.078622 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-l74mh"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.079357 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093445 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093506 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.099803 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.108071 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195348 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195391 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195449 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bbp\" (UniqueName: 
\"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195496 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.195674 4720 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.195724 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair podName:c338dc84-0c3a-44c4-8f08-82001f532c2b nodeName:}" failed. No retries permitted until 2026-01-21 14:40:50.695703574 +0000 UTC m=+688.604443506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-xcckr" (UID: "c338dc84-0c3a-44c4-8f08-82001f532c2b") : secret "openshift-nmstate-webhook" not found Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.223868 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.227594 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.228064 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.228908 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.231242 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.234807 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.235020 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-shtmv" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.283756 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296451 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bbp\" (UniqueName: \"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296520 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296576 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296598 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296614 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296635 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: 
\"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296751 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296879 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.316372 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7bbp\" (UniqueName: \"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.385062 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399462 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399501 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399527 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.400318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.400399 4720 
Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.400399 4720 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.400438 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert podName:e3d11ff0-1741-4f0d-aa50-6e0144e843a6 nodeName:}" failed. No retries permitted until 2026-01-21 14:40:50.900425231 +0000 UTC m=+688.809165163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-f9sxz" (UID: "e3d11ff0-1741-4f0d-aa50-6e0144e843a6") : secret "plugin-serving-cert" not found
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.411346 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-l74mh"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.434706 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.439941 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85d688dff7-p76qd"]
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.440540 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.457483 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d688dff7-p76qd"]
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500529 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500582 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500712 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500769 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500866 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500955 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602185 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602551 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602688 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602706 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.603997 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.604026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.604631 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.605323 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.605453 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.607938 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.608119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.610197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.620476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd"
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.708349 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.767251 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.847123 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: W0121 14:40:50.852528 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26c9332_5a74_49a3_8347_45ae67cb1c90.slice/crio-a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f WatchSource:0}: Error finding container a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f: Status 404 returned error can't find the container with id a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.910003 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.914696 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.999397 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d688dff7-p76qd"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.999644 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.131640 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l74mh" event={"ID":"da16493b-aa03-4556-b3ce-d87ccfdbba70","Type":"ContainerStarted","Data":"3608421bc3bbf14b1488c1288b54ed71445a35fb46df1d0473279fc089317f50"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.135917 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d688dff7-p76qd" event={"ID":"c3e9bed0-25b4-4616-a0f2-44bd9950735a","Type":"ContainerStarted","Data":"30f95b7fd6a59d9c58be086c6ff7c2bdb9861aeb840e7c98162b7d2846eee49f"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.142504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.175093 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:51 crc kubenswrapper[4720]: W0121 14:40:51.179418 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc338dc84_0c3a_44c4_8f08_82001f532c2b.slice/crio-e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb WatchSource:0}: Error finding container e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb: Status 404 returned error can't find the container with id e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.187561 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.347482 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.148682 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" event={"ID":"c338dc84-0c3a-44c4-8f08-82001f532c2b","Type":"ContainerStarted","Data":"e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.150234 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" event={"ID":"e3d11ff0-1741-4f0d-aa50-6e0144e843a6","Type":"ContainerStarted","Data":"39a30cf9fcd6ba2b91e5e4b720fa7877f84ef0dda663c4815eb617330ea1de1b"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.152558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d688dff7-p76qd" event={"ID":"c3e9bed0-25b4-4616-a0f2-44bd9950735a","Type":"ContainerStarted","Data":"6e6863e309843893eab584e997d0e46fc9f1e40de1882f6dcc8dcbf411b19942"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.178302 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85d688dff7-p76qd" podStartSLOduration=2.178281668 podStartE2EDuration="2.178281668s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:40:52.170862334 +0000 UTC m=+690.079602276" watchObservedRunningTime="2026-01-21 14:40:52.178281668 +0000 UTC m=+690.087021600" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.170585 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"609fa9440b839fda98e31da0aaea3d146ccc38efcd5f5dce5a7909cab95a01ed"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.173400 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" event={"ID":"e3d11ff0-1741-4f0d-aa50-6e0144e843a6","Type":"ContainerStarted","Data":"1b55122c65ef537e2f7d4eb8f11ce825b69eb82673e877e3e769fd2951c26ea8"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.175595 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l74mh" event={"ID":"da16493b-aa03-4556-b3ce-d87ccfdbba70","Type":"ContainerStarted","Data":"51e7f37323fdf4b7ed7edfbdf268006aebd143b1f8de67b38812f9745a65b891"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.175838 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.178181 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" event={"ID":"c338dc84-0c3a-44c4-8f08-82001f532c2b","Type":"ContainerStarted","Data":"b7abfc80b708d9aabad82f3e03ab3091cd274b923e9c304125a7639791b500ef"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.178362 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 
14:40:55.192938 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" podStartSLOduration=2.454574379 podStartE2EDuration="5.192917183s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:51.35405548 +0000 UTC m=+689.262795412" lastFinishedPulling="2026-01-21 14:40:54.092398274 +0000 UTC m=+692.001138216" observedRunningTime="2026-01-21 14:40:55.189768867 +0000 UTC m=+693.098508839" watchObservedRunningTime="2026-01-21 14:40:55.192917183 +0000 UTC m=+693.101657135" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.248543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" podStartSLOduration=2.330175158 podStartE2EDuration="5.24852013s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:51.181061894 +0000 UTC m=+689.089801826" lastFinishedPulling="2026-01-21 14:40:54.099406856 +0000 UTC m=+692.008146798" observedRunningTime="2026-01-21 14:40:55.244444158 +0000 UTC m=+693.153184090" watchObservedRunningTime="2026-01-21 14:40:55.24852013 +0000 UTC m=+693.157260072" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.252752 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-l74mh" podStartSLOduration=1.622857306 podStartE2EDuration="5.252741355s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:50.470805652 +0000 UTC m=+688.379545584" lastFinishedPulling="2026-01-21 14:40:54.100689681 +0000 UTC m=+692.009429633" observedRunningTime="2026-01-21 14:40:55.216299535 +0000 UTC m=+693.125039497" watchObservedRunningTime="2026-01-21 14:40:55.252741355 +0000 UTC m=+693.161481287" Jan 21 14:40:57 crc kubenswrapper[4720]: I0121 14:40:57.208516 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"7aacb0a71bf3d361b9ec556c4329119139f871922ad3541968fe7c9537a421d6"} Jan 21 14:40:57 crc kubenswrapper[4720]: I0121 14:40:57.229065 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" podStartSLOduration=1.6149096680000001 podStartE2EDuration="7.229043317s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:50.858639475 +0000 UTC m=+688.767379407" lastFinishedPulling="2026-01-21 14:40:56.472773124 +0000 UTC m=+694.381513056" observedRunningTime="2026-01-21 14:40:57.221414498 +0000 UTC m=+695.130154440" watchObservedRunningTime="2026-01-21 14:40:57.229043317 +0000 UTC m=+695.137783249" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.432554 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.768146 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.768501 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.772946 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85d688dff7-p76qd" 
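The pod_startup_latency_tracker.go:104 entries encode a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is roughly that E2E figure minus the image pull window (lastFinishedPulling minus firstStartedPulling), which is why the console pod, with zero-valued pull timestamps, reports identical SLO and E2E values. Recomputing the nmstate-webhook numbers from the timestamps above (the layout string is Go's reference time; the arithmetic is the point):

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the nmstate-webhook latency entry above.
	created := mustParse("2026-01-21 14:40:50 +0000 UTC")
	running := mustParse("2026-01-21 14:40:55.24852013 +0000 UTC")
	pullStart := mustParse("2026-01-21 14:40:51.181061894 +0000 UTC")
	pullEnd := mustParse("2026-01-21 14:40:54.099406856 +0000 UTC")

	e2e := running.Sub(created)
	pull := pullEnd.Sub(pullStart)
	fmt.Println("podStartE2EDuration:", e2e)      // 5.24852013s, as logged
	fmt.Println("image pull window:", pull)       // 2.918344962s
	fmt.Println("podStartSLOduration:", e2e-pull) // ~2.330175168s vs logged 2.330175158
}

The tiny tail-digit disagreement suggests the tracker measures against its own internal clock readings rather than the printed strings, but the subtract-the-pull-window model fits all four pods in this log (exactly so for nmstate-metrics).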
Jan 21 14:41:01 crc kubenswrapper[4720]: I0121 14:41:01.237417 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:41:01 crc kubenswrapper[4720]: I0121 14:41:01.284439 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-42g76"]
Jan 21 14:41:11 crc kubenswrapper[4720]: I0121 14:41:11.005690 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.650310 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.652079 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.654496 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.663610 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743430 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743521 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743631 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846008 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846407 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846893 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846441 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.847026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.882690 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.968195 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:24 crc kubenswrapper[4720]: I0121 14:41:24.373951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:24 crc kubenswrapper[4720]: W0121 14:41:24.385847 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93611686_cfcc_4f9b_985d_a8e0d9cb7219.slice/crio-6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488 WatchSource:0}: Error finding container 6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488: Status 404 returned error can't find the container with id 6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488
Jan 21 14:41:25 crc kubenswrapper[4720]: I0121 14:41:25.373872 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerStarted","Data":"6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488"}
Jan 21 14:41:26 crc kubenswrapper[4720]: I0121 14:41:26.345229 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" containerID="cri-o://d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" gracePeriod=15
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.292883 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-42g76_ac15d591-5558-4df9-b596-a1e27325bd6c/console/0.log"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.293313 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388794 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-42g76_ac15d591-5558-4df9-b596-a1e27325bd6c/console/0.log"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388848 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" exitCode=2
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerDied","Data":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"}
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388943 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerDied","Data":"28165debc992515a62bbac33db73e05a5347bebc002b160765e6c1b991bcf92e"}
Need to start a new one" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.389009 4720 scope.go:117] "RemoveContainer" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.390685 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.390921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392867 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392926 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392972 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393003 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393051 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391542 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="d317812bc13480a9a7c6599235163fbdc34765b9104211917cffb37875fd7f5c" exitCode=0 Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391576 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"d317812bc13480a9a7c6599235163fbdc34765b9104211917cffb37875fd7f5c"} Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391876 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393948 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config" (OuterVolumeSpecName: "console-config") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.394522 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.396192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.401148 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.407103 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f" (OuterVolumeSpecName: "kube-api-access-nzt2f") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "kube-api-access-nzt2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.408192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.460807 4720 scope.go:117] "RemoveContainer" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: E0121 14:41:27.461247 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": container with ID starting with d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1 not found: ID does not exist" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.461342 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"} err="failed to get container status \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": rpc error: code = NotFound desc = could not find container \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": container with ID starting with d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1 not found: ID does not exist" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494077 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494100 4720 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494110 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494118 4720 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494126 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494134 4720 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494141 4720 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.715326 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.719785 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:41:28 
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.169002 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.169066 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.685963 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" path="/var/lib/kubelet/pods/ac15d591-5558-4df9-b596-a1e27325bd6c/volumes"
Jan 21 14:41:29 crc kubenswrapper[4720]: I0121 14:41:29.416236 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="7c6456fe74570fb8e91805cb05b2f00cdcee03d21bd6b62636db15e2ddc2cacc" exitCode=0
Jan 21 14:41:29 crc kubenswrapper[4720]: I0121 14:41:29.416298 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"7c6456fe74570fb8e91805cb05b2f00cdcee03d21bd6b62636db15e2ddc2cacc"}
Jan 21 14:41:30 crc kubenswrapper[4720]: I0121 14:41:30.423355 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="b1513ae31d6d95b27359f88b9e6cce6e2d01bfa32ab6bb5a175814a2e3252d12" exitCode=0
Jan 21 14:41:30 crc kubenswrapper[4720]: I0121 14:41:30.423475 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"b1513ae31d6d95b27359f88b9e6cce6e2d01bfa32ab6bb5a175814a2e3252d12"}
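The patch_prober/prober entries record one last readiness probe racing the teardown of the old console pod: the GET to the pod IP times out because the endpoint is already gone, which is expected noise during a rollout rather than a new failure. A sketch of an HTTPS readiness check with a short client timeout in the spirit of the failure output above; the one-second timeout is an illustrative guess, and skipping certificate verification mirrors how kubelet HTTPS probes generally behave rather than anything visible in this log:

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs a single readiness-style GET; a slow or vanished
// endpoint surfaces as "Client.Timeout exceeded while awaiting headers".
func probeOnce(url string, timeout time.Duration) error {
	client := &http.Client{
		Timeout: timeout,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: %s", resp.Status)
	}
	return nil
}

func main() {
	if err := probeOnce("https://10.217.0.12:8443/health", time.Second); err != nil {
		fmt.Println("probe failed:", err)
	}
}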
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750291 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750408 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750450 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.753343 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle" (OuterVolumeSpecName: "bundle") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.757389 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w" (OuterVolumeSpecName: "kube-api-access-4l99w") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "kube-api-access-4l99w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.770851 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util" (OuterVolumeSpecName: "util") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852087 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852129 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852140 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488"} Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435539 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488" Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435542 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.402893 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403680 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="pull" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403695 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="pull" Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403709 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="util" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403717 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="util" Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403735 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403743 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403751 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403759 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403879 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract" Jan 21 14:41:39 crc 
kubenswrapper[4720]: I0121 14:41:39.403895 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.404820 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.426822 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455207 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455260 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455293 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.556970 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557054 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557572 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557615 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.584585 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.721082 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.138294 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482036 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1" exitCode=0 Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482127 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1"} Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482271 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"501dc062329c9af7f8d1683a77c7040d8ab41ee73e94a74219700bf01c887a58"} Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.489474 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae"} Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.752396 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"] Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.753485 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.757688 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.757929 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761078 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dhnxl" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761444 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761960 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.778915 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"] Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.785917 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.785997 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.786019 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887084 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887131 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.893231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.894157 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.918437 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.067111 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.108409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"] Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.109516 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.120314 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.120839 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.121145 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4bkg6" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.133318 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"] Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190410 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.291295 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.291641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.292350 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 
14:41:42.308481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.312435 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.344556 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.431983 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.497876 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae" exitCode=0 Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.497925 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae"} Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.760987 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"] Jan 21 14:41:42 crc kubenswrapper[4720]: W0121 14:41:42.769039 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6fdd799_fe82_4cd7_b825_c755b6189180.slice/crio-8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158 WatchSource:0}: Error finding container 8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158: Status 404 returned error can't find the container with id 8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158 Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.788096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"] Jan 21 14:41:42 crc kubenswrapper[4720]: W0121 14:41:42.795499 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c334ce5_b6c7_40c8_a261_5a5084ae3db8.slice/crio-ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1 WatchSource:0}: Error finding container ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1: Status 404 returned error can't find the container with id ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1 Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.505363 4720 
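The mount sequences above move each volume through three logged phases: "VerifyControllerAttachedVolume started" (reconciler_common.go:245), "MountVolume started" (reconciler_common.go:218), and "MountVolume.SetUp succeeded" (operation_generator.go:637). A rough sketch for measuring per-volume mount latency from such lines; it keys on volume name only, so a real analysis would also key on the pod UID, since names like webhook-cert recur across pods:

    import re
    from datetime import datetime

    PHASE_RE = re.compile(r'(?P<phase>MountVolume started|MountVolume\.SetUp succeeded)'
                          r' for volume \\?"(?P<vol>[^"\\]+)')
    TS_RE = re.compile(r'[IWEF]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})')

    def mount_latencies(lines):
        """Seconds from 'MountVolume started' to 'SetUp succeeded', per volume name."""
        started, done = {}, {}
        for line in lines:
            ts_m, ph_m = TS_RE.search(line), PHASE_RE.search(line)
            if not (ts_m and ph_m):
                continue
            ts = datetime.strptime(ts_m.group(1), '%H:%M:%S.%f')
            target = started if ph_m['phase'] == 'MountVolume started' else done
            target.setdefault(ph_m['vol'], ts)  # keep the first occurrence
        return {v: (done[v] - started[v]).total_seconds()
                for v in started.keys() & done.keys()}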
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" event={"ID":"b6fdd799-fe82-4cd7-b825-c755b6189180","Type":"ContainerStarted","Data":"8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158"} Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.509160 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9"} Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.510430 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" event={"ID":"6c334ce5-b6c7-40c8-a261-5a5084ae3db8","Type":"ContainerStarted","Data":"ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1"} Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.530539 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s476k" podStartSLOduration=2.070362448 podStartE2EDuration="4.530519021s" podCreationTimestamp="2026-01-21 14:41:39 +0000 UTC" firstStartedPulling="2026-01-21 14:41:40.483311944 +0000 UTC m=+738.392051876" lastFinishedPulling="2026-01-21 14:41:42.943468517 +0000 UTC m=+740.852208449" observedRunningTime="2026-01-21 14:41:43.525643159 +0000 UTC m=+741.434383101" watchObservedRunningTime="2026-01-21 14:41:43.530519021 +0000 UTC m=+741.439258953" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.553309 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" event={"ID":"6c334ce5-b6c7-40c8-a261-5a5084ae3db8","Type":"ContainerStarted","Data":"60e8f62f1c6586e4983f90dd3e72d9a9553f94285ab954fb178b144a36b88655"} Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.553842 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.554573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" event={"ID":"b6fdd799-fe82-4cd7-b825-c755b6189180","Type":"ContainerStarted","Data":"3bc2cb9b972e30d0f4c9ba67b9f3df87a323c8fc889fe67ba255fdf3a5d02197"} Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.554799 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.578209 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" podStartSLOduration=1.304114934 podStartE2EDuration="7.578193621s" podCreationTimestamp="2026-01-21 14:41:42 +0000 UTC" firstStartedPulling="2026-01-21 14:41:42.800080454 +0000 UTC m=+740.708820386" lastFinishedPulling="2026-01-21 14:41:49.074159141 +0000 UTC m=+746.982899073" observedRunningTime="2026-01-21 14:41:49.576960529 +0000 UTC m=+747.485700471" watchObservedRunningTime="2026-01-21 14:41:49.578193621 +0000 UTC m=+747.486933553" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.603259 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" podStartSLOduration=2.382912654 
podStartE2EDuration="8.603242195s" podCreationTimestamp="2026-01-21 14:41:41 +0000 UTC" firstStartedPulling="2026-01-21 14:41:42.774061605 +0000 UTC m=+740.682801537" lastFinishedPulling="2026-01-21 14:41:48.994391156 +0000 UTC m=+746.903131078" observedRunningTime="2026-01-21 14:41:49.597306293 +0000 UTC m=+747.506046235" watchObservedRunningTime="2026-01-21 14:41:49.603242195 +0000 UTC m=+747.511982127" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.722054 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.722106 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.765691 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:50 crc kubenswrapper[4720]: I0121 14:41:50.605138 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:50 crc kubenswrapper[4720]: I0121 14:41:50.648785 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:52 crc kubenswrapper[4720]: I0121 14:41:52.570314 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s476k" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" containerID="cri-o://68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" gracePeriod=2 Jan 21 14:41:53 crc kubenswrapper[4720]: I0121 14:41:53.576072 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" exitCode=0 Jan 21 14:41:53 crc kubenswrapper[4720]: I0121 14:41:53.576218 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9"} Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.136715 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285827 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285877 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285908 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.286859 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities" (OuterVolumeSpecName: "utilities") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.292864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9" (OuterVolumeSpecName: "kube-api-access-gmhc9") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "kube-api-access-gmhc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.342008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387237 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387275 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387286 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582561 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"501dc062329c9af7f8d1683a77c7040d8ab41ee73e94a74219700bf01c887a58"} Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582615 4720 scope.go:117] "RemoveContainer" containerID="68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582622 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.611629 4720 scope.go:117] "RemoveContainer" containerID="35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.630994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.655300 4720 scope.go:117] "RemoveContainer" containerID="c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.657107 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.683793 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" path="/var/lib/kubelet/pods/488fea59-5b8b-41f0-82c4-e148ffe21d66/volumes" Jan 21 14:42:02 crc kubenswrapper[4720]: I0121 14:42:02.436786 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.071085 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.879972 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.880031 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905076 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ldp4q"] Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905353 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905376 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905392 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-content" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905401 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-content" Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905422 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-utilities" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905431 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-utilities" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905551 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.907951 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.911158 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.911492 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kn5dr" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.959200 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.969467 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"] Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.970070 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.971556 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979025 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979084 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979111 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979137 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979153 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.980619 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.996671 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081479 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081507 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081547 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081600 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081697 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.082327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083057 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " 
pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083242 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083563 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.083699 4720 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.083743 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs podName:bc431866-4baf-47fc-8767-705a11b9bea0 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.583727044 +0000 UTC m=+781.492466966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs") pod "frr-k8s-ldp4q" (UID: "bc431866-4baf-47fc-8767-705a11b9bea0") : secret "frr-k8s-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.090222 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m7fv6"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.091061 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.095500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.095681 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.096977 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.105815 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ghmgt" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.116609 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.129292 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.147228 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.191568 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.209793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218020 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218162 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218317 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.219347 4720 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.220272 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert podName:8ba45f1e-4559-4408-b129-b061d406fce6 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.720252018 +0000 UTC m=+781.628991950 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert") pod "frr-k8s-webhook-server-7df86c4f6c-lsrs9" (UID: "8ba45f1e-4559-4408-b129-b061d406fce6") : secret "frr-k8s-webhook-server-cert" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.226366 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.251491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324745 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324809 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324838 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324858 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324896 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 
crc kubenswrapper[4720]: I0121 14:42:23.325735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.325874 4720 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.325977 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.825952734 +0000 UTC m=+781.734692666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "speaker-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.326138 4720 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.326475 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.826452687 +0000 UTC m=+781.735192619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.376722 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.425800 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.426024 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.426089 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.426258 4720 
secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.426327 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs podName:51379103-8c08-45c6-a0f3-86928d43bd50 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.926310693 +0000 UTC m=+781.835050635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs") pod "controller-6968d8fdc4-72sfn" (UID: "51379103-8c08-45c6-a0f3-86928d43bd50") : secret "controller-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.428463 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.444129 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.447304 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.628538 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.632221 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.729759 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.733912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.831094 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 
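Read end to end, each "Couldn't get secret" failure above resolves once the secret is created: metrics-certs for frr-k8s-ldp4q fails at 14:42:23.083743 with a 500ms backoff and mounts successfully at 14:42:23.632221, roughly one retry later. A sketch pairing the first SetUp failure with the eventual success per volume; like the earlier sketch it keys on volume name only for brevity, where a real tool would also key on pod UID:

    import re
    from datetime import datetime

    # Failure text uses plain quotes; success text uses backslash-escaped quotes.
    FAIL_RE = re.compile(r'(\d{2}:\d{2}:\d{2}\.\d{6}).*MountVolume\.SetUp failed '
                         r'for volume "(?P<vol>[^"]+)"')
    OK_RE = re.compile(r'(\d{2}:\d{2}:\d{2}\.\d{6}).*MountVolume\.SetUp succeeded '
                       r'for volume \\"(?P<vol>[^\\"]+)')

    def time_to_mount(lines):
        """Seconds between first SetUp failure and eventual success, per volume."""
        fmt = '%H:%M:%S.%f'
        first_fail, fixed = {}, {}
        for line in lines:
            if (m := FAIL_RE.search(line)):
                first_fail.setdefault(m['vol'], datetime.strptime(m.group(1), fmt))
            elif (m := OK_RE.search(line)):
                fixed.setdefault(m['vol'], datetime.strptime(m.group(1), fmt))
        return {v: (fixed[v] - first_fail[v]).total_seconds()
                for v in first_fail.keys() & fixed.keys()}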
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.831094 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6"
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.831146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6"
Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.831326 4720 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.831378 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:24.831365104 +0000 UTC m=+782.740105026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "metallb-memberlist" not found
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.835332 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6"
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.869387 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ldp4q"
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.883952 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.932876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn"
Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.938041 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn"
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.132279 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"]
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.193200 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-72sfn"
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.414760 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"]
Jan 21 14:42:24 crc kubenswrapper[4720]: W0121 14:42:24.424844 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51379103_8c08_45c6_a0f3_86928d43bd50.slice/crio-f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38 WatchSource:0}: Error finding container f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38: Status 404 returned error can't find the container with id f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.778369 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" event={"ID":"8ba45f1e-4559-4408-b129-b061d406fce6","Type":"ContainerStarted","Data":"c5f0dce0886a0f45f4d580530432e224be426dfd5dc9636c8c812544243b96be"}
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780008 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"033635c4cde1f6b14605870b40852ebc1a586337c5021d0360ca4027e4cf5165"}
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780074 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"69893f24dcc0456f382d452250a564ffa504b03705c1e4a3f37dbd144ceeb914"}
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38"}
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780132 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-72sfn"
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.781059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"c0c41611c7145e5b0e0ea75fd6f73d19e649dbcc38ead343c57f81de823d7f75"}
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.842296 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6"
Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.852709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6"
Need to start a new one" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.791772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"f0a13214bb75291d97e3af3d605824f46fd8f1c24699d17f7ab7aae38b99e5c7"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792313 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"c1c8e81afb098b499c69cb88296ce85e15eb366a8a357a1195b822ae21081807"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"7a51f48a672401516059e87e6889f5918b5cd9e7bd84ffdb833a30530489e5b0"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792605 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.832220 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m7fv6" podStartSLOduration=2.832188617 podStartE2EDuration="2.832188617s" podCreationTimestamp="2026-01-21 14:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:42:25.830532191 +0000 UTC m=+783.739272143" watchObservedRunningTime="2026-01-21 14:42:25.832188617 +0000 UTC m=+783.740928549" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.835883 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-72sfn" podStartSLOduration=2.8358723660000003 podStartE2EDuration="2.835872366s" podCreationTimestamp="2026-01-21 14:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:42:24.80642161 +0000 UTC m=+782.715161542" watchObservedRunningTime="2026-01-21 14:42:25.835872366 +0000 UTC m=+783.744612298" Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.848232 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" event={"ID":"8ba45f1e-4559-4408-b129-b061d406fce6","Type":"ContainerStarted","Data":"45ea7c6bfa6c49e5078825ca8fc3138247ff6c611784682a78cfd6b20c23e07c"} Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.848943 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.850373 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="8cbfd64e143a4d969991c6c3868408621d232046abcb9794c8ba70ac78af5ad7" exitCode=0 Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.850440 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"8cbfd64e143a4d969991c6c3868408621d232046abcb9794c8ba70ac78af5ad7"} Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.883723 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" podStartSLOduration=2.976383444 podStartE2EDuration="11.88369991s" podCreationTimestamp="2026-01-21 14:42:22 +0000 UTC" firstStartedPulling="2026-01-21 14:42:24.140052851 +0000 UTC m=+782.048792783" lastFinishedPulling="2026-01-21 14:42:33.047369317 +0000 UTC m=+790.956109249" observedRunningTime="2026-01-21 14:42:33.876998537 +0000 UTC m=+791.785738479" watchObservedRunningTime="2026-01-21 14:42:33.88369991 +0000 UTC m=+791.792439852" Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.198106 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.858070 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="d875c689cd516950e8c727164aa1f83a699878b57d4975b5fcdb4e30cfc35b48" exitCode=0 Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.858422 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"d875c689cd516950e8c727164aa1f83a699878b57d4975b5fcdb4e30cfc35b48"} Jan 21 14:42:35 crc kubenswrapper[4720]: I0121 14:42:35.867787 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="c3f81f517600b572281fe35f7c14ad55fb9dedc54f5d96394ef7c9e491415f51" exitCode=0 Jan 21 14:42:35 crc kubenswrapper[4720]: I0121 14:42:35.867833 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"c3f81f517600b572281fe35f7c14ad55fb9dedc54f5d96394ef7c9e491415f51"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884381 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"75f094c831763e16abd57d3786b4b507697b41e0827e145b51ae0232971ced2f"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"e4b8cfc74428268a6890248629bde07d7690f7af2a1130a3f58fa6e28ae3c8f8"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884742 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"44b91dda3a8f5c829dc40c325f4a8638311d1721889cd2f36669f3e53dd12984"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884752 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"dd13d9feb684e2f848b5881c989af0162eda63aa8d01ac6762117052094583e8"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"4f5daa2d322c35a72fc0343ceffe578662539dc1c840ade647e38eaef13cb105"} Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.896706 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" 
event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"ebdc542055b717ea624c9c4707f124a6fde6fe9a4cec20285ab032ded8165957"} Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.897056 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.924241 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ldp4q" podStartSLOduration=7.993178997 podStartE2EDuration="16.924222688s" podCreationTimestamp="2026-01-21 14:42:22 +0000 UTC" firstStartedPulling="2026-01-21 14:42:24.155783689 +0000 UTC m=+782.064523621" lastFinishedPulling="2026-01-21 14:42:33.08682738 +0000 UTC m=+790.995567312" observedRunningTime="2026-01-21 14:42:38.92170771 +0000 UTC m=+796.830447672" watchObservedRunningTime="2026-01-21 14:42:38.924222688 +0000 UTC m=+796.832962620" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.869931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.888382 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.918307 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:44 crc kubenswrapper[4720]: I0121 14:42:44.911759 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.263803 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.265291 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.272924 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.367960 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.368029 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.368559 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.469961 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470034 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470547 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470699 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.499884 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.579407 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.830898 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.939132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"e6b964d8585fa7b58ed6a33e8427e5c8bd074516b39249e500cc2ba8378e5880"} Jan 21 14:42:46 crc kubenswrapper[4720]: I0121 14:42:46.945673 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411" exitCode=0 Jan 21 14:42:46 crc kubenswrapper[4720]: I0121 14:42:46.945730 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411"} Jan 21 14:42:47 crc kubenswrapper[4720]: I0121 14:42:47.952471 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42"} Jan 21 14:42:48 crc kubenswrapper[4720]: I0121 14:42:48.959475 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42" exitCode=0 Jan 21 14:42:48 crc kubenswrapper[4720]: I0121 14:42:48.959535 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42"} Jan 21 14:42:49 crc kubenswrapper[4720]: I0121 14:42:49.968677 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d"} Jan 21 14:42:49 crc kubenswrapper[4720]: I0121 14:42:49.992802 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5gmc" podStartSLOduration=2.38661926 podStartE2EDuration="4.992787581s" podCreationTimestamp="2026-01-21 14:42:45 +0000 UTC" firstStartedPulling="2026-01-21 14:42:46.947770342 +0000 UTC m=+804.856510274" lastFinishedPulling="2026-01-21 14:42:49.553938673 +0000 UTC m=+807.462678595" observedRunningTime="2026-01-21 14:42:49.987359954 +0000 UTC m=+807.896099956" watchObservedRunningTime="2026-01-21 14:42:49.992787581 +0000 UTC m=+807.901527513" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.834311 4720 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.835213 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.837709 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sn7bq" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.838118 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.840518 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.846618 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.973360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.075372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.094451 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.153410 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.350792 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.880746 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.881076 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.986455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerStarted","Data":"1d71d45e77d980126f90f6eb7ecc522605b8c38e60ba5198007f7f8090fab602"} Jan 21 14:42:53 crc kubenswrapper[4720]: I0121 14:42:53.872984 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.579780 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.580111 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.638901 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:56 crc kubenswrapper[4720]: I0121 14:42:56.071643 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:56 crc kubenswrapper[4720]: I0121 14:42:56.829982 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.033132 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.633934 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.634675 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.650070 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.752064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.853177 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.880147 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.956812 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:42:58 crc kubenswrapper[4720]: I0121 14:42:58.022919 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j5gmc" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" containerID="cri-o://68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" gracePeriod=2 Jan 21 14:42:58 crc kubenswrapper[4720]: I0121 14:42:58.398473 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"] Jan 21 14:42:58 crc kubenswrapper[4720]: W0121 14:42:58.406577 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d59157d_f538_4cb0_959d_11584d7678c5.slice/crio-cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a WatchSource:0}: Error finding container cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a: Status 404 returned error can't find the container with id cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.034717 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" exitCode=0 Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.034726 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d"} Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.036999 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j4xn9" 
event={"ID":"5d59157d-f538-4cb0-959d-11584d7678c5","Type":"ContainerStarted","Data":"cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a"} Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.191716 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.289580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.290014 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.290097 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.291021 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities" (OuterVolumeSpecName: "utilities") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.295196 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv" (OuterVolumeSpecName: "kube-api-access-8nqhv") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "kube-api-access-8nqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.329388 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391273 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391326 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391346 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051799 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"e6b964d8585fa7b58ed6a33e8427e5c8bd074516b39249e500cc2ba8378e5880"} Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051845 4720 scope.go:117] "RemoveContainer" containerID="68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051950 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.077894 4720 scope.go:117] "RemoveContainer" containerID="c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.087782 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.113287 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.121612 4720 scope.go:117] "RemoveContainer" containerID="b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411" Jan 21 14:43:02 crc kubenswrapper[4720]: I0121 14:43:02.690018 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" path="/var/lib/kubelet/pods/00a18014-031d-42cb-b5b2-c9114b70f910/volumes" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436038 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436867 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-utilities" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436897 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-utilities" Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436924 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-content" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436939 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-content" Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436974 4720 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436990 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.437253 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.439038 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.495470 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505516 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505633 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505870 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607188 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607914 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.624599 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.771905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:14 crc kubenswrapper[4720]: I0121 14:43:14.418412 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.143066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j4xn9" event={"ID":"5d59157d-f538-4cb0-959d-11584d7678c5","Type":"ContainerStarted","Data":"6b19eeaefd46f15129fe9c2ac59824dc6c5406747555ecf52061bf222fc35d2a"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145141 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a" exitCode=0 Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145201 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145226 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"cf6301ab54f5e2a0499597c73edfe33ce7ec8e78073269084732953f390b0d82"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.147942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerStarted","Data":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.148456 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rgdp7" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" containerID="cri-o://0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" gracePeriod=2 Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.165011 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-j4xn9" podStartSLOduration=2.361470482 podStartE2EDuration="18.164988981s" 
podCreationTimestamp="2026-01-21 14:42:57 +0000 UTC" firstStartedPulling="2026-01-21 14:42:58.409094662 +0000 UTC m=+816.317834604" lastFinishedPulling="2026-01-21 14:43:14.212613171 +0000 UTC m=+832.121353103" observedRunningTime="2026-01-21 14:43:15.160498158 +0000 UTC m=+833.069238090" watchObservedRunningTime="2026-01-21 14:43:15.164988981 +0000 UTC m=+833.073728913" Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.201801 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rgdp7" podStartSLOduration=2.353681916 podStartE2EDuration="24.201784282s" podCreationTimestamp="2026-01-21 14:42:51 +0000 UTC" firstStartedPulling="2026-01-21 14:42:52.357448293 +0000 UTC m=+810.266188215" lastFinishedPulling="2026-01-21 14:43:14.205550649 +0000 UTC m=+832.114290581" observedRunningTime="2026-01-21 14:43:15.199058488 +0000 UTC m=+833.107798410" watchObservedRunningTime="2026-01-21 14:43:15.201784282 +0000 UTC m=+833.110524214" Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.496340 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.518164 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.523604 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7" (OuterVolumeSpecName: "kube-api-access-m4kz7") pod "f29e3816-0dc6-4e24-80ce-3f0669a92a8a" (UID: "f29e3816-0dc6-4e24-80ce-3f0669a92a8a"). InnerVolumeSpecName "kube-api-access-m4kz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.619757 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154690 4720 generic.go:334] "Generic (PLEG): container finished" podID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" exitCode=0 Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154772 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154785 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerDied","Data":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"} Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154818 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerDied","Data":"1d71d45e77d980126f90f6eb7ecc522605b8c38e60ba5198007f7f8090fab602"} Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154838 4720 scope.go:117] "RemoveContainer" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.157816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"} Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.171471 4720 scope.go:117] "RemoveContainer" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" Jan 21 14:43:16 crc kubenswrapper[4720]: E0121 14:43:16.173085 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": container with ID starting with 0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab not found: ID does not exist" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.173228 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"} err="failed to get container status \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": rpc error: code = NotFound desc = could not find container \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": container with ID starting with 0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab not found: ID does not exist" Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.204571 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.212519 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.686241 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" path="/var/lib/kubelet/pods/f29e3816-0dc6-4e24-80ce-3f0669a92a8a/volumes" Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.167501 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9" exitCode=0 Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.167611 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" 
event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"} Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.957180 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.957852 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.987086 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:18 crc kubenswrapper[4720]: I0121 14:43:18.175233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} Jan 21 14:43:18 crc kubenswrapper[4720]: I0121 14:43:18.190481 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqgp6" podStartSLOduration=9.664984352 podStartE2EDuration="12.190463039s" podCreationTimestamp="2026-01-21 14:43:06 +0000 UTC" firstStartedPulling="2026-01-21 14:43:15.147805673 +0000 UTC m=+833.056545615" lastFinishedPulling="2026-01-21 14:43:17.67328435 +0000 UTC m=+835.582024302" observedRunningTime="2026-01-21 14:43:18.189931875 +0000 UTC m=+836.098671817" watchObservedRunningTime="2026-01-21 14:43:18.190463039 +0000 UTC m=+836.099202981" Jan 21 14:43:19 crc kubenswrapper[4720]: I0121 14:43:19.204255 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.873501 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:21 crc kubenswrapper[4720]: E0121 14:43:21.873886 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.873904 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.874058 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.875053 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.877078 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zr58t" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.891026 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901346 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002413 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002456 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002494 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002935 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002964 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.027035 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.192675 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.654506 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880464 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880528 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880588 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.881229 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.881282 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47" gracePeriod=600 Jan 21 14:43:23 crc kubenswrapper[4720]: I0121 14:43:23.218550 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerStarted","Data":"e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.226385 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="7156339bbe2ef44793d9c82fb8ad6b72ea97b6fcf7156ca0cf6c708c18ca8d2e" exitCode=0 Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.226473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"7156339bbe2ef44793d9c82fb8ad6b72ea97b6fcf7156ca0cf6c708c18ca8d2e"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230340 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47" exitCode=0 Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230368 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230388 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230403 4720 scope.go:117] "RemoveContainer" containerID="a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" Jan 21 14:43:25 crc kubenswrapper[4720]: I0121 14:43:25.243834 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="b4ef023ecea2b77b19d79870d43876fa179747ce0c6ae26e6cdf987696ba54d6" exitCode=0 Jan 21 14:43:25 crc kubenswrapper[4720]: I0121 14:43:25.243921 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"b4ef023ecea2b77b19d79870d43876fa179747ce0c6ae26e6cdf987696ba54d6"} Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.253022 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="c7076facda3e1c647765188f6a34fd22ef222fa1b3f88427381505c6288daf09" exitCode=0 Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.253238 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"c7076facda3e1c647765188f6a34fd22ef222fa1b3f88427381505c6288daf09"} Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.772432 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.772480 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.813019 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.303285 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.498370 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690297 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690397 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.691313 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle" (OuterVolumeSpecName: "bundle") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.695189 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq" (OuterVolumeSpecName: "kube-api-access-bd6xq") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "kube-api-access-bd6xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.715795 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util" (OuterVolumeSpecName: "util") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791911 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791957 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791976 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.268182 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.268351 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a"} Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.269122 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.227092 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.272440 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqgp6" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server" containerID="cri-o://b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" gracePeriod=2 Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.789330 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830085 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830195 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830286 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.839619 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities" (OuterVolumeSpecName: "utilities") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.844361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs" (OuterVolumeSpecName: "kube-api-access-5fgxs") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "kube-api-access-5fgxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.891949 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932108 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932143 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932154 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284595 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" exitCode=0 Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284645 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"cf6301ab54f5e2a0499597c73edfe33ce7ec8e78073269084732953f390b0d82"} Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284750 4720 scope.go:117] "RemoveContainer" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284774 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.313864 4720 scope.go:117] "RemoveContainer" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.334830 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.338582 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.347853 4720 scope.go:117] "RemoveContainer" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367192 4720 scope.go:117] "RemoveContainer" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.367644 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": container with ID starting with b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e not found: ID does not exist" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367770 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} err="failed to get container status \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": rpc error: code = NotFound desc = could not find container \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": container with ID starting with b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e not found: ID does not exist" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367812 4720 scope.go:117] "RemoveContainer" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9" Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.368224 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": container with ID starting with 4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9 not found: ID does not exist" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368259 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"} err="failed to get container status \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": rpc error: code = NotFound desc = could not find container \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": container with ID starting with 4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9 not found: ID does not exist" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368292 4720 scope.go:117] "RemoveContainer" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a" Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.368879 4720 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": container with ID starting with f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a not found: ID does not exist" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368911 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"} err="failed to get container status \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": rpc error: code = NotFound desc = could not find container \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": container with ID starting with f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a not found: ID does not exist" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.687003 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" path="/var/lib/kubelet/pods/81ab0c5a-1ce9-47b5-aa19-ff309d2da011/volumes" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368610 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"] Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368861 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368874 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract" Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368893 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="util" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368901 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="util" Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368913 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368921 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server" Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368935 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-utilities" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368943 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-utilities" Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368958 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="pull" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368966 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="pull" Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368980 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-content" Jan 21 14:43:32 crc 
kubenswrapper[4720]: I0121 14:43:32.368987 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-content" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369139 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369581 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.371432 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4ngth" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.452181 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"] Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.463992 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.565984 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.585404 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.687462 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.991379 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"] Jan 21 14:43:33 crc kubenswrapper[4720]: I0121 14:43:33.310383 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" event={"ID":"d3800217-b53a-4788-a9d4-8861cfdb68a1","Type":"ContainerStarted","Data":"ddd4e1059f51fd84885f344a93627688387ac371ba2ce7278799e9189a4b6973"} Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.349390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" event={"ID":"d3800217-b53a-4788-a9d4-8861cfdb68a1","Type":"ContainerStarted","Data":"11a85db7a64fafef1e314d06f183216c8b0d9c71c648d6691047e0a3ac0f7043"} Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.349931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.384076 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" podStartSLOduration=1.4094432000000001 podStartE2EDuration="6.384051129s" podCreationTimestamp="2026-01-21 14:43:32 +0000 UTC" firstStartedPulling="2026-01-21 14:43:33.007100476 +0000 UTC m=+850.915840408" lastFinishedPulling="2026-01-21 14:43:37.981708405 +0000 UTC m=+855.890448337" observedRunningTime="2026-01-21 14:43:38.37929387 +0000 UTC m=+856.288033802" watchObservedRunningTime="2026-01-21 14:43:38.384051129 +0000 UTC m=+856.292791081" Jan 21 14:43:52 crc kubenswrapper[4720]: I0121 14:43:52.691197 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.545038 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.546432 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.547688 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.549614 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-48m8m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.561688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.573364 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.574337 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.579917 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zn7jv" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.581295 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.582211 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.584046 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-cgt55" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.593766 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.603896 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.637556 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.638393 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649437 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649526 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649626 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.685391 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-88wsq" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.685533 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.696788 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.745871 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.746641 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750739 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750828 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.751486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-clrhz" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.754533 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.755414 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.757503 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-75pgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.767745 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.774968 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.797616 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.800350 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.831580 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.852402 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.863993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.866121 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.866834 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.869813 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zcxwg" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.869984 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.894037 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.897121 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.897950 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.900099 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.906012 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.906585 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.911977 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-p4v8k" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.912229 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hjtz5" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.937032 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.948363 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.957995 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.959096 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.959136 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.994137 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.028099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.043888 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.044592 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.061560 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qfvft" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.062035 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.062976 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063001 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063033 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063342 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.068048 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.068883 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.093032 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jq5jw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.142488 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.144728 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.145748 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210332 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210387 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210430 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210458 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210493 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.210936 4720 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.210987 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:14.710971345 +0000 UTC m=+892.619711277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.225087 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vqlzt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.231623 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.290623 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.295247 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.308051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.311016 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313260 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313351 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.314102 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.368228 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8hmvp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.374468 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.375234 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.383313 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.390623 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.391485 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.401283 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-btdw9" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.405083 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.415890 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416557 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416592 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416637 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.454573 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.459608 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.475945 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.488330 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.502210 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " 
pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.510211 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.511119 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527251 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527109 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527182 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7pxct" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.563784 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.565037 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.569644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ntr5l" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.586773 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.594277 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.596196 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628072 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628126 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628205 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.632778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.633551 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.651573 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.652444 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.672414 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.672859 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fmnjf" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.673410 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.680040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vc82s" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.687990 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.722927 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730007 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730061 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730101 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.733930 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.734204 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.734229 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.734141 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.734594 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.23457975 +0000 UTC m=+893.143319682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.736462 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.736502 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.736483522 +0000 UTC m=+893.645223454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.742194 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.755190 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773102 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773310 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773372 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.774106 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.787308 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.791929 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lk8rz" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.795544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836010 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836064 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.862063 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.864469 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.873041 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.873533 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.877174 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vptjt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.880454 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.881256 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.896441 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-swtlp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.928102 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.928600 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938289 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938433 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.952195 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.984520 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.002870 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.003029 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.010615 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.040321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.040600 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.106082 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.120899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.184615 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.186488 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.212380 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.228263 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.245995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.246159 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.246263 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rgchh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.248224 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.248409 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.248457 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:16.248443721 +0000 UTC m=+894.157183653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.257023 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.263382 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.292162 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.293084 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.295581 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.304379 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cnwp6" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.304918 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.334914 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352585 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352675 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352698 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.410417 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.455307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458693 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458743 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: 
\"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460201 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460261 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.960242437 +0000 UTC m=+893.868982369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460391 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460424 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.960415302 +0000 UTC m=+893.869155234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.463933 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.499040 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: W0121 14:44:15.520140 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96218341_1cf7_4aa1_bb9a_7a7abba7a93e.slice/crio-8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32 WatchSource:0}: Error finding container 8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32: Status 404 returned error can't find the container with id 8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32 Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.560578 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.585358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.621536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" event={"ID":"6c93648a-7076-4d91-ac7a-f389ab1159cc","Type":"ContainerStarted","Data":"4de74b9430eca3d9796788dbf314218cdbdfb3a296b55018ba2792fc21dfd78a"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.639708 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" event={"ID":"96218341-1cf7-4aa1-bb9a-7a7abba7a93e","Type":"ContainerStarted","Data":"8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.640650 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" event={"ID":"655f8c6a-4936-45d3-9538-66ee77a050d3","Type":"ContainerStarted","Data":"467e42d2bfb99b52210143ad72cbe50f704bba71478ce48827add1fd22fe519d"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.642638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" event={"ID":"b7ea6739-9c38-44a0-a382-8b26e37138fa","Type":"ContainerStarted","Data":"2709ab9bd89c9a2bd6e188957fe2bc34182484aa9343c08e775b56940d6423af"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.654085 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.763884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.764479 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.764585 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:17.764562233 +0000 UTC m=+895.673302165 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.924917 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.959276 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.970224 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.970266 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970436 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970483 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. 
No retries permitted until 2026-01-21 14:44:16.970467699 +0000 UTC m=+894.879207631 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970846 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970872 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:16.97086362 +0000 UTC m=+894.879603552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.002502 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.010343 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.076540 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.110712 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88327b24_ce00_4bb4_98d1_24060c6dbf28.slice/crio-7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5 WatchSource:0}: Error finding container 7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5: Status 404 returned error can't find the container with id 7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5 Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.113930 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.121731 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.126262 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.171177 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.208545 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.218376 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.227743 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.235280 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxxf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-4tjlt_openstack-operators(a2557af5-c155-4d37-9b9a-f9335cac47b1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.235413 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.236480 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.241876 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2e9655_961c_4250_9852_332dfe335b4a.slice/crio-bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6 WatchSource:0}: Error finding container bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6: Status 404 returned error can't find the container with id bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6 Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.243582 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcwnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-jfkfq_openstack-operators(de2e9655-961c-4250-9852-332dfe335b4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.246753 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.277773 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.277974 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.278037 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.278018862 +0000 UTC m=+896.186758794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.340609 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.344746 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.349494 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda050e31c_3d6d_490c_8f74_637c37c96a5e.slice/crio-06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311 WatchSource:0}: Error finding container 06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311: Status 404 returned error can't find the container with id 06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311 Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.359771 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd17e86c_5586_4ea9_979d_2c195494fe99.slice/crio-6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72 WatchSource:0}: Error finding container 6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72: Status 404 returned error can't find the container with id 6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72 Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.363392 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmsgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-xczlv_openstack-operators(cd17e86c-5586-4ea9-979d-2c195494fe99): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.365808 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.375627 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"]
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.383403 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2zp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mm7cg_openstack-operators(8db4bced-5679-43ab-a5c9-ba87574aaa02): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.384719 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.663482 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" event={"ID":"9a5569f7-371f-4663-b005-5fdcce36936b","Type":"ContainerStarted","Data":"d85df34e81d1295c94ffdc4e8d4ca0bd384c874626cc4ba5f395a80812343576"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.666435 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" event={"ID":"de2e9655-961c-4250-9852-332dfe335b4a","Type":"ContainerStarted","Data":"bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.668106 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" event={"ID":"c38df2a4-6626-4b71-9dcd-7ef3003ee693","Type":"ContainerStarted","Data":"ad986fcfad4b159d41f754a788241bae78656828028b3ea9e5f35412c2c6f264"}
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.676291 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.676694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" event={"ID":"8db4bced-5679-43ab-a5c9-ba87574aaa02","Type":"ContainerStarted","Data":"356ea00aac9afd839cc8b829df740b648a053aed9dbb2c957aeca9749ccd4ef3"}
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.678162 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02"
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.682920 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702320 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" event={"ID":"18ce7f0d-00de-4a92-97f2-743d9057abff","Type":"ContainerStarted","Data":"64698aed373aa846d9adffb375d485022a13648703c14b7e2fe148020540c9be"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702364 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" event={"ID":"a2557af5-c155-4d37-9b9a-f9335cac47b1","Type":"ContainerStarted","Data":"e23be9095254d2e90d03ef41e4d3a169308157a70bc64dfbc3d8a67df557b34f"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702376 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" event={"ID":"88327b24-ce00-4bb4-98d1-24060c6dbf28","Type":"ContainerStarted","Data":"7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702386 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" event={"ID":"071d4469-5b09-49a3-97f4-239d811825a2","Type":"ContainerStarted","Data":"193be7305a6eac486d49dcadcae7aa92048fd21dfcd08a156d4cd642174269ae"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.703942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" event={"ID":"9695fd09-d135-426b-a129-66f945d2dd90","Type":"ContainerStarted","Data":"1b008f060ae4d6cec2a9348eee3da41d87032c67738d60165b931e2544775587"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.709827 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" event={"ID":"a050e31c-3d6d-490c-8f74-637c37c96a5e","Type":"ContainerStarted","Data":"06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.723722 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" event={"ID":"cd17e86c-5586-4ea9-979d-2c195494fe99","Type":"ContainerStarted","Data":"6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72"}
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.725570 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.728602 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" event={"ID":"085a2e93-1496-47f3-a7dc-4acae2e201fc","Type":"ContainerStarted","Data":"2d211466bdd1649f8c106eac2eb3ce8f05c80b13ca75b726ac9cbaecc9fd6f01"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.732300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" event={"ID":"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5","Type":"ContainerStarted","Data":"43584b1caac15137c1334b925a24f0376a433f856b9c58bd0b4db09ea7599a13"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.741243 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" event={"ID":"370e5a87-5edf-4d48-9b65-335400a84cd2","Type":"ContainerStarted","Data":"00ce8de16f0d8b050f9fea187c22f6dc461f2628f617aacde9fa9fa92fbeb733"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.743398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" event={"ID":"589a442f-27a6-4d23-85dd-9e5b1556363f","Type":"ContainerStarted","Data":"ceb822f4a19e352e475c926fe67026a70f3f3d82505ab054b023f9126c6692c5"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.745022 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" event={"ID":"9b467fa8-1984-4659-8873-99c20204b16b","Type":"ContainerStarted","Data":"3be16a220f0ce2b9b4d7071b9308f8ebd3d77b11b12a54736d85d3714d83c6b9"}
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.988782 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.989069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.988944 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989204 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.989190284 +0000 UTC m=+896.897930216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989158 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989546 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.989535184 +0000 UTC m=+896.898275106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.756924 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a"
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757248 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02"
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757299 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1"
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757365 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99"
Jan 21 14:44:17 crc kubenswrapper[4720]: I0121 14:44:17.809376 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.811612 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.823820 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:21.823771077 +0000 UTC m=+899.732511009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:18 crc kubenswrapper[4720]: I0121 14:44:18.340109 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"
Jan 21 14:44:18 crc kubenswrapper[4720]: E0121 14:44:18.340274 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 14:44:18 crc kubenswrapper[4720]: E0121 14:44:18.340333 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:22.34031613 +0000 UTC m=+900.249056062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 14:44:19 crc kubenswrapper[4720]: I0121 14:44:19.056352 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:19 crc kubenswrapper[4720]: I0121 14:44:19.056397 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.057823 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.057898 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:23.057881167 +0000 UTC m=+900.966621099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found
Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.058641 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.058837 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:23.058813862 +0000 UTC m=+900.967553794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found
Jan 21 14:44:21 crc kubenswrapper[4720]: I0121 14:44:21.911598 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"
Jan 21 14:44:21 crc kubenswrapper[4720]: E0121 14:44:21.912000 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:21 crc kubenswrapper[4720]: E0121 14:44:21.912053 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:29.912035824 +0000 UTC m=+907.820775756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:22 crc kubenswrapper[4720]: I0121 14:44:22.418840 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"
Jan 21 14:44:22 crc kubenswrapper[4720]: E0121 14:44:22.418981 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 14:44:22 crc kubenswrapper[4720]: E0121 14:44:22.419028 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:30.419014067 +0000 UTC m=+908.327753999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 14:44:23 crc kubenswrapper[4720]: I0121 14:44:23.136953 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:23 crc kubenswrapper[4720]: I0121 14:44:23.138198 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138166 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138514 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:31.138499875 +0000 UTC m=+909.047239807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found
Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138313 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.139001 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:31.138991968 +0000 UTC m=+909.047731900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.411154 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488"
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.411716 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fl7q2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-9b68f5989-wnzfm_openstack-operators(b7ea6739-9c38-44a0-a382-8b26e37138fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.412912 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" podUID="b7ea6739-9c38-44a0-a382-8b26e37138fa"
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.880758 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" podUID="b7ea6739-9c38-44a0-a382-8b26e37138fa"
Jan 21 14:44:29 crc kubenswrapper[4720]: I0121 14:44:29.953533 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.955561 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.955695 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:45.955630499 +0000 UTC m=+923.864370431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found
Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.461269 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"
Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.470789 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"
Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.752626 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7pxct"
Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.761917 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.172602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.172677 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.178855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.184866 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.185035 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t4cv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-v4fbm_openstack-operators(589a442f-27a6-4d23-85dd-9e5b1556363f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.186170 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podUID="589a442f-27a6-4d23-85dd-9e5b1556363f"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.186963 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.466206 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rgchh"
Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.475409 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.829547 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.829759 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msw9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-2clln_openstack-operators(18ce7f0d-00de-4a92-97f2-743d9057abff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.830905 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podUID="18ce7f0d-00de-4a92-97f2-743d9057abff"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.887773 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podUID="18ce7f0d-00de-4a92-97f2-743d9057abff"
Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.887859 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podUID="589a442f-27a6-4d23-85dd-9e5b1556363f"
Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.142567 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231"
Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.143192 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99khf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-vzzmp_openstack-operators(bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.144728 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podUID="bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5"
Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.919497 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podUID="bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5"
Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.499241 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492"
Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.499454 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p2nlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-594c8c9d5d-bl4z8_openstack-operators(9a5569f7-371f-4663-b005-5fdcce36936b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.500752 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podUID="9a5569f7-371f-4663-b005-5fdcce36936b"
Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.932452 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podUID="9a5569f7-371f-4663-b005-5fdcce36936b"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.228973 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.229201 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xht7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-8hrkh_openstack-operators(a050e31c-3d6d-490c-8f74-637c37c96a5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.230576 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podUID="a050e31c-3d6d-490c-8f74-637c37c96a5e"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.914237 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.914562 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twq9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7ddb5c749-q2t2m_openstack-operators(655f8c6a-4936-45d3-9538-66ee77a050d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.916701 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podUID="655f8c6a-4936-45d3-9538-66ee77a050d3"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.936387 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podUID="a050e31c-3d6d-490c-8f74-637c37c96a5e"
Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.936929 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podUID="655f8c6a-4936-45d3-9538-66ee77a050d3"
Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.515931 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32"
Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.516135 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7gkdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-864f6b75bf-n5bwd_openstack-operators(370e5a87-5edf-4d48-9b65-335400a84cd2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.517540 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podUID="370e5a87-5edf-4d48-9b65-335400a84cd2"
Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.941368 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32\\\"\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podUID="370e5a87-5edf-4d48-9b65-335400a84cd2"
Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.149297 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf"
Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.150092 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mlzcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-689zh_openstack-operators(88327b24-ce00-4bb4-98d1-24060c6dbf28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.151386 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podUID="88327b24-ce00-4bb4-98d1-24060c6dbf28"
Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.946382 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podUID="88327b24-ce00-4bb4-98d1-24060c6dbf28"
Jan 21 14:44:43 crc kubenswrapper[4720]: I0121 14:44:43.682778 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.270820 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.271082 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pn2dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-glbt4_openstack-operators(9b467fa8-1984-4659-8873-99c20204b16b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.272308 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podUID="9b467fa8-1984-4659-8873-99c20204b16b"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.851021 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.851649 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ww4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-c6994669c-gwlgm_openstack-operators(6c93648a-7076-4d91-ac7a-f389ab1159cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.854189 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podUID="6c93648a-7076-4d91-ac7a-f389ab1159cc"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.978097 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028\\\"\"" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podUID="6c93648a-7076-4d91-ac7a-f389ab1159cc"
Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.978217 4720 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podUID="9b467fa8-1984-4659-8873-99c20204b16b" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.008196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.021404 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.047448 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zcxwg" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.057107 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.454255 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.454434 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9d8th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-pw4z6_openstack-operators(9695fd09-d135-426b-a129-66f945d2dd90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.455630 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podUID="9695fd09-d135-426b-a129-66f945d2dd90" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.991292 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podUID="9695fd09-d135-426b-a129-66f945d2dd90" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.591242 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.591531 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5kjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-cb4666565-d22bk_openstack-operators(c38df2a4-6626-4b71-9dcd-7ef3003ee693): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.592705 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podUID="c38df2a4-6626-4b71-9dcd-7ef3003ee693" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.993867 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podUID="c38df2a4-6626-4b71-9dcd-7ef3003ee693" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.395943 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.396116 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxxf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-4tjlt_openstack-operators(a2557af5-c155-4d37-9b9a-f9335cac47b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.397338 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.977332 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.977722 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcwnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-jfkfq_openstack-operators(de2e9655-961c-4250-9852-332dfe335b4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.981511 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.199554 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.199717 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmsgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-xczlv_openstack-operators(cd17e86c-5586-4ea9-979d-2c195494fe99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.201378 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.729850 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.730153 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8fx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-54hwg_openstack-operators(085a2e93-1496-47f3-a7dc-4acae2e201fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.731303 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podUID="085a2e93-1496-47f3-a7dc-4acae2e201fc" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.037392 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podUID="085a2e93-1496-47f3-a7dc-4acae2e201fc" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.104183 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.104344 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2zp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mm7cg_openstack-operators(8db4bced-5679-43ab-a5c9-ba87574aaa02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.105562 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.769824 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.906681 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.994901 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.043072 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" event={"ID":"b7ea6739-9c38-44a0-a382-8b26e37138fa","Type":"ContainerStarted","Data":"25dc0e69a9ca5bfa199a30b67aa5e77dece5373b5796de32a87b545138d95826"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.043290 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.044954 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" event={"ID":"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5","Type":"ContainerStarted","Data":"02714d73a06c1ab2e0e827cee30936b4ad1bf1a5b5f5f7c174e19e6bb1418af1"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.045134 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.048067 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" event={"ID":"18ce7f0d-00de-4a92-97f2-743d9057abff","Type":"ContainerStarted","Data":"039b7dd4a95f9875eb3baf220815896626a7694bb2e2236faadabdbf7a8345f0"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.048514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.052504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" event={"ID":"eb81b686-832a-414b-aa66-cf40a72a7427","Type":"ContainerStarted","Data":"edb99a7ba20c3db307a55145637fcd23c1ea76a16d7e92449ca0d659fedbf714"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.053741 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" event={"ID":"370e5a87-5edf-4d48-9b65-335400a84cd2","Type":"ContainerStarted","Data":"115ea91d56873cb47c112be05820d5378f2f171bcf58765f5ce91b4a2f851cd9"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.053942 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.055199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" event={"ID":"b80cffaf-5853-47ac-b783-c26da64425ff","Type":"ContainerStarted","Data":"317ba5688fe50672c7df2a62fda41a00f31d58d621aa4f3411f6bf91cf1548c5"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.057484 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" event={"ID":"88e81fdb-6501-410c-9452-d3ba7f41a30d","Type":"ContainerStarted","Data":"b7814ceac5f9c7ed17ca9501dbd04b92f686ec24c1a5d0d24cdbf4eb92168380"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.069948 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" podStartSLOduration=2.982712792 podStartE2EDuration="43.069928208s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.123853839 +0000 UTC m=+893.032593761" lastFinishedPulling="2026-01-21 14:44:55.211069225 +0000 UTC m=+933.119809177" observedRunningTime="2026-01-21 14:44:56.061858608 +0000 UTC m=+933.970598540" watchObservedRunningTime="2026-01-21 14:44:56.069928208 +0000 UTC m=+933.978668140" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.071374 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" event={"ID":"655f8c6a-4936-45d3-9538-66ee77a050d3","Type":"ContainerStarted","Data":"dd8a820b6f770e56cc5f4fb8508a7ff4c0a5b576c11c4c670f11ff60076c068e"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.072072 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.078584 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" event={"ID":"071d4469-5b09-49a3-97f4-239d811825a2","Type":"ContainerStarted","Data":"bff2b48d1274a26c4af5723d29cc757b4ea18436a4199cc6eba12a248bc8a98a"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.079200 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.090143 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" event={"ID":"9a5569f7-371f-4663-b005-5fdcce36936b","Type":"ContainerStarted","Data":"c1139b85e5ee1a71aef08278791cf53cf15521556580d8333fc6e5443dc4344b"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.090319 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.091588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" event={"ID":"589a442f-27a6-4d23-85dd-9e5b1556363f","Type":"ContainerStarted","Data":"9fe05ff653c035141be9155e808b3735cd2dc809ce42e617e65f99a4b72de9a4"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.092244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.093642 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" event={"ID":"96218341-1cf7-4aa1-bb9a-7a7abba7a93e","Type":"ContainerStarted","Data":"88e33885539513554b4b6e245a5a208386045d6f526ab5a4f2b699eff92ffbd3"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.094012 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.095018 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" event={"ID":"a050e31c-3d6d-490c-8f74-637c37c96a5e","Type":"ContainerStarted","Data":"a556a5f3bb89df39fb8dd1f31bbf4c28e1d6092323d424ab58c60dceb82bb1fa"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.095360 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.112831 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podStartSLOduration=3.128621411 
podStartE2EDuration="42.112809296s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.225438451 +0000 UTC m=+894.134178383" lastFinishedPulling="2026-01-21 14:44:55.209626326 +0000 UTC m=+933.118366268" observedRunningTime="2026-01-21 14:44:56.112529318 +0000 UTC m=+934.021269250" watchObservedRunningTime="2026-01-21 14:44:56.112809296 +0000 UTC m=+934.021549228" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.114556 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podStartSLOduration=3.070700744 podStartE2EDuration="42.114546693s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.225018429 +0000 UTC m=+894.133758361" lastFinishedPulling="2026-01-21 14:44:55.268864378 +0000 UTC m=+933.177604310" observedRunningTime="2026-01-21 14:44:56.08544808 +0000 UTC m=+933.994188012" watchObservedRunningTime="2026-01-21 14:44:56.114546693 +0000 UTC m=+934.023286655" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.136045 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podStartSLOduration=3.026714726 podStartE2EDuration="42.136028767s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.181867855 +0000 UTC m=+894.090607787" lastFinishedPulling="2026-01-21 14:44:55.291181896 +0000 UTC m=+933.199921828" observedRunningTime="2026-01-21 14:44:56.135237316 +0000 UTC m=+934.043977248" watchObservedRunningTime="2026-01-21 14:44:56.136028767 +0000 UTC m=+934.044768699" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.159513 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podStartSLOduration=2.880504626 podStartE2EDuration="42.159491936s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.932874726 +0000 UTC m=+893.841614658" lastFinishedPulling="2026-01-21 14:44:55.211862036 +0000 UTC m=+933.120601968" observedRunningTime="2026-01-21 14:44:56.154816139 +0000 UTC m=+934.063556081" watchObservedRunningTime="2026-01-21 14:44:56.159491936 +0000 UTC m=+934.068231868" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.242505 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" podStartSLOduration=6.613191276 podStartE2EDuration="43.242483146s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.989812595 +0000 UTC m=+893.898552527" lastFinishedPulling="2026-01-21 14:44:52.619104425 +0000 UTC m=+930.527844397" observedRunningTime="2026-01-21 14:44:56.239817653 +0000 UTC m=+934.148557585" watchObservedRunningTime="2026-01-21 14:44:56.242483146 +0000 UTC m=+934.151223078" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.304369 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podStartSLOduration=4.070653413 podStartE2EDuration="43.304351881s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.060394287 +0000 UTC m=+893.969134219" lastFinishedPulling="2026-01-21 14:44:55.294092755 +0000 UTC m=+933.202832687" observedRunningTime="2026-01-21 
14:44:56.299696383 +0000 UTC m=+934.208436315" watchObservedRunningTime="2026-01-21 14:44:56.304351881 +0000 UTC m=+934.213091813" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.305200 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podStartSLOduration=3.389212635 podStartE2EDuration="42.305194183s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.351499293 +0000 UTC m=+894.260239225" lastFinishedPulling="2026-01-21 14:44:55.267480841 +0000 UTC m=+933.176220773" observedRunningTime="2026-01-21 14:44:56.274852727 +0000 UTC m=+934.183592659" watchObservedRunningTime="2026-01-21 14:44:56.305194183 +0000 UTC m=+934.213934105" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.358343 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podStartSLOduration=3.595905486 podStartE2EDuration="43.35832351s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.51581347 +0000 UTC m=+893.424553402" lastFinishedPulling="2026-01-21 14:44:55.278231494 +0000 UTC m=+933.186971426" observedRunningTime="2026-01-21 14:44:56.32896214 +0000 UTC m=+934.237702072" watchObservedRunningTime="2026-01-21 14:44:56.35832351 +0000 UTC m=+934.267063432" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.360521 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" podStartSLOduration=6.27907441 podStartE2EDuration="43.36051636s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.5374658 +0000 UTC m=+893.446205732" lastFinishedPulling="2026-01-21 14:44:52.61890772 +0000 UTC m=+930.527647682" observedRunningTime="2026-01-21 14:44:56.353630172 +0000 UTC m=+934.262370104" watchObservedRunningTime="2026-01-21 14:44:56.36051636 +0000 UTC m=+934.269256292" Jan 21 14:44:57 crc kubenswrapper[4720]: I0121 14:44:57.106705 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" event={"ID":"eb81b686-832a-414b-aa66-cf40a72a7427","Type":"ContainerStarted","Data":"19ff05a60a49d9a47c6a2ed4ccfca59971a386247e464f0f2626c2650e73f180"} Jan 21 14:44:57 crc kubenswrapper[4720]: I0121 14:44:57.196076 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" podStartSLOduration=43.196056138 podStartE2EDuration="43.196056138s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:44:57.179813016 +0000 UTC m=+935.088552958" watchObservedRunningTime="2026-01-21 14:44:57.196056138 +0000 UTC m=+935.104796070" Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.117193 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" event={"ID":"88327b24-ce00-4bb4-98d1-24060c6dbf28","Type":"ContainerStarted","Data":"7fd2f9a820b98ec31d5d549fa09baa4a93071ad8a796273e83096f99e20d28d1"} Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.117535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.138311 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podStartSLOduration=3.021170835 podStartE2EDuration="44.138295071s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.14166672 +0000 UTC m=+894.050406652" lastFinishedPulling="2026-01-21 14:44:57.258790956 +0000 UTC m=+935.167530888" observedRunningTime="2026-01-21 14:44:58.133819449 +0000 UTC m=+936.042559381" watchObservedRunningTime="2026-01-21 14:44:58.138295071 +0000 UTC m=+936.047035003" Jan 21 14:44:59 crc kubenswrapper[4720]: E0121 14:44:59.681565 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.175113 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.177834 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.182101 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.182175 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.193017 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.320970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.321004 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.321041 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc 
kubenswrapper[4720]: I0121 14:45:00.422012 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422055 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422095 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422976 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.430060 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.440763 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.505111 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: E0121 14:45:00.730350 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.135598 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" event={"ID":"9b467fa8-1984-4659-8873-99c20204b16b","Type":"ContainerStarted","Data":"1bfcc1dd03888aef23c8e6b3d57950f247f7c32e68f8977a1c9da1be4fc97a20"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.136093 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.138813 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" event={"ID":"c38df2a4-6626-4b71-9dcd-7ef3003ee693","Type":"ContainerStarted","Data":"a77758167e47a97957508571a71924682fe25a9b67a056bc8fac33da30ae5c19"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.139341 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.141000 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" event={"ID":"6c93648a-7076-4d91-ac7a-f389ab1159cc","Type":"ContainerStarted","Data":"fbc6fca7434cef1e381fd6d87885e6a980bd13602aa13f1164869ad731a0e6d4"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.141161 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.144378 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" event={"ID":"b80cffaf-5853-47ac-b783-c26da64425ff","Type":"ContainerStarted","Data":"5e5cb687a7f92716f2cb25340a10dc99596dca88bc80c02499ffd1f7c4b6cfea"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.144543 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.146333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" event={"ID":"9695fd09-d135-426b-a129-66f945d2dd90","Type":"ContainerStarted","Data":"759f94efc0cae9fd026c4ad9faef3e14ae101ea44c8e128b12e4ab140e7f5779"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.146863 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.148170 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" event={"ID":"88e81fdb-6501-410c-9452-d3ba7f41a30d","Type":"ContainerStarted","Data":"4221bf11b4cb7619f41f6fcea41e7817f6ec752ba991e56eab430bbf275c95e3"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.148604 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.158890 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podStartSLOduration=3.45389838 podStartE2EDuration="48.15886659s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.099507772 +0000 UTC m=+894.008247704" lastFinishedPulling="2026-01-21 14:45:00.804475982 +0000 UTC m=+938.713215914" observedRunningTime="2026-01-21 14:45:01.151912171 +0000 UTC m=+939.060652123" watchObservedRunningTime="2026-01-21 14:45:01.15886659 +0000 UTC m=+939.067606542" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.179904 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podStartSLOduration=2.47477383 podStartE2EDuration="47.179883053s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.09940824 +0000 UTC m=+894.008148172" lastFinishedPulling="2026-01-21 14:45:00.804517463 +0000 UTC m=+938.713257395" observedRunningTime="2026-01-21 14:45:01.172067479 +0000 UTC m=+939.080807421" watchObservedRunningTime="2026-01-21 14:45:01.179883053 +0000 UTC m=+939.088622995" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.220220 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podStartSLOduration=2.534013172 podStartE2EDuration="47.22020451s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.113441742 +0000 UTC m=+894.022181684" lastFinishedPulling="2026-01-21 14:45:00.79963309 +0000 UTC m=+938.708373022" observedRunningTime="2026-01-21 14:45:01.202592971 +0000 UTC m=+939.111332923" watchObservedRunningTime="2026-01-21 14:45:01.22020451 +0000 UTC m=+939.128944442" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.221424 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.222184 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podStartSLOduration=2.714369686 podStartE2EDuration="48.222178034s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.274272754 +0000 UTC m=+893.183012686" lastFinishedPulling="2026-01-21 14:45:00.782081102 +0000 UTC m=+938.690821034" observedRunningTime="2026-01-21 14:45:01.219217713 +0000 UTC m=+939.127957655" watchObservedRunningTime="2026-01-21 14:45:01.222178034 +0000 UTC m=+939.130917966" Jan 21 14:45:01 crc kubenswrapper[4720]: W0121 14:45:01.235568 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56142c27_a9f6_4617_ad00_0d1cd7416732.slice/crio-dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e WatchSource:0}: Error finding container dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e: Status 404 returned error can't find the container with id dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.254215 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" podStartSLOduration=43.471312606 podStartE2EDuration="48.254166125s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:56.016488853 +0000 UTC m=+933.925228785" lastFinishedPulling="2026-01-21 14:45:00.799342372 +0000 UTC m=+938.708082304" observedRunningTime="2026-01-21 14:45:01.251117602 +0000 UTC m=+939.159857554" watchObservedRunningTime="2026-01-21 14:45:01.254166125 +0000 UTC m=+939.162906067" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.324635 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" podStartSLOduration=42.507523081 podStartE2EDuration="47.324614563s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:55.987654888 +0000 UTC m=+933.896409880" lastFinishedPulling="2026-01-21 14:45:00.80476143 +0000 UTC m=+938.713501362" observedRunningTime="2026-01-21 14:45:01.308885635 +0000 UTC m=+939.217625577" watchObservedRunningTime="2026-01-21 14:45:01.324614563 +0000 UTC m=+939.233354495" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.481703 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.157326 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerStarted","Data":"a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1"} Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.157605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerStarted","Data":"dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e"} Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.179168 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" podStartSLOduration=2.179150348 podStartE2EDuration="2.179150348s" podCreationTimestamp="2026-01-21 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:45:02.174211294 +0000 UTC m=+940.082951236" watchObservedRunningTime="2026-01-21 14:45:02.179150348 +0000 UTC m=+940.087890280" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.164354 4720 generic.go:334] "Generic (PLEG): container finished" podID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerID="a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1" exitCode=0 Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 
14:45:03.164414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerDied","Data":"a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1"} Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.866968 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.932155 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.934083 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.066721 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.377639 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.408347 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.464197 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.581953 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582338 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582821 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume" (OuterVolumeSpecName: "config-volume") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.587053 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.603033 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr" (OuterVolumeSpecName: "kube-api-access-cwvxr") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "kube-api-access-cwvxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684416 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684454 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684466 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.746233 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.760419 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.928875 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.932031 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.013474 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178456 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerDied","Data":"dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e"} Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178527 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178475 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.299126 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:45:05 crc kubenswrapper[4720]: E0121 14:45:05.679345 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:45:06 crc kubenswrapper[4720]: I0121 14:45:06.065516 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:45:09 crc kubenswrapper[4720]: E0121 14:45:09.679872 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.213236 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" event={"ID":"085a2e93-1496-47f3-a7dc-4acae2e201fc","Type":"ContainerStarted","Data":"6e3ab235b1a036848d728d8093859634b696ed3cd06458985b0bd31ee1226158"} Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.213972 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.240620 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podStartSLOduration=3.665598445 podStartE2EDuration="57.240597509s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.026386602 +0000 UTC m=+893.935126534" lastFinishedPulling="2026-01-21 14:45:09.601385666 +0000 UTC m=+947.510125598" observedRunningTime="2026-01-21 14:45:10.234608566 +0000 UTC m=+948.143348518" watchObservedRunningTime="2026-01-21 14:45:10.240597509 +0000 UTC m=+948.149337461" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.771529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:45:13 crc kubenswrapper[4720]: I0121 14:45:13.961174 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.240735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" event={"ID":"de2e9655-961c-4250-9852-332dfe335b4a","Type":"ContainerStarted","Data":"50e4ac158e03d1576d0b0021ba97e23ccd5940edf9ab87017ac66602fdea9014"} Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.241718 
4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.378420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.401193 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podStartSLOduration=3.510243511 podStartE2EDuration="1m0.401161515s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.24340628 +0000 UTC m=+894.152146212" lastFinishedPulling="2026-01-21 14:45:13.134324284 +0000 UTC m=+951.043064216" observedRunningTime="2026-01-21 14:45:14.260982188 +0000 UTC m=+952.169722150" watchObservedRunningTime="2026-01-21 14:45:14.401161515 +0000 UTC m=+952.309901467" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.589529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.599229 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.726078 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.256614 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" event={"ID":"a2557af5-c155-4d37-9b9a-f9335cac47b1","Type":"ContainerStarted","Data":"a98414e7ded5d377094a8e8b855ce704262508c6acaa7327778376dc0c31fd50"} Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.257910 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.310133 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podStartSLOduration=2.527852374 podStartE2EDuration="1m2.310111138s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.235134095 +0000 UTC m=+894.143874027" lastFinishedPulling="2026-01-21 14:45:16.017392859 +0000 UTC m=+953.926132791" observedRunningTime="2026-01-21 14:45:16.305374829 +0000 UTC m=+954.214114761" watchObservedRunningTime="2026-01-21 14:45:16.310111138 +0000 UTC m=+954.218851080" Jan 21 14:45:19 crc kubenswrapper[4720]: I0121 14:45:19.278796 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" event={"ID":"8db4bced-5679-43ab-a5c9-ba87574aaa02","Type":"ContainerStarted","Data":"28813b3446ff07ba2908d4dde7287b08c5bec53016fc7df8edc5b7f7b4820c22"} Jan 21 14:45:19 crc kubenswrapper[4720]: I0121 14:45:19.295892 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podStartSLOduration=2.335237476 podStartE2EDuration="1m4.295871358s" podCreationTimestamp="2026-01-21 14:44:15 
+0000 UTC" firstStartedPulling="2026-01-21 14:44:16.383304958 +0000 UTC m=+894.292044890" lastFinishedPulling="2026-01-21 14:45:18.34393883 +0000 UTC m=+956.252678772" observedRunningTime="2026-01-21 14:45:19.294166032 +0000 UTC m=+957.202906014" watchObservedRunningTime="2026-01-21 14:45:19.295871358 +0000 UTC m=+957.204611290" Jan 21 14:45:22 crc kubenswrapper[4720]: I0121 14:45:22.310083 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" event={"ID":"cd17e86c-5586-4ea9-979d-2c195494fe99","Type":"ContainerStarted","Data":"c3e105273f6dfcd33639148919e2dc67f7d144de9f53263b410bebaf143572db"} Jan 21 14:45:23 crc kubenswrapper[4720]: I0121 14:45:23.318207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:45:23 crc kubenswrapper[4720]: I0121 14:45:23.339211 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podStartSLOduration=3.576139275 podStartE2EDuration="1m9.339192002s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.363291774 +0000 UTC m=+894.272031706" lastFinishedPulling="2026-01-21 14:45:22.126344501 +0000 UTC m=+960.035084433" observedRunningTime="2026-01-21 14:45:23.333884937 +0000 UTC m=+961.242624879" watchObservedRunningTime="2026-01-21 14:45:23.339192002 +0000 UTC m=+961.247931944" Jan 21 14:45:24 crc kubenswrapper[4720]: I0121 14:45:24.989386 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:45:25 crc kubenswrapper[4720]: I0121 14:45:25.263615 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:45:35 crc kubenswrapper[4720]: I0121 14:45:35.220760 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.708795 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:52 crc kubenswrapper[4720]: E0121 14:45:52.709521 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.709533 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.709682 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.710496 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.719579 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7pv8k" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.719800 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.721027 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.721143 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.734729 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.805116 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.806694 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807765 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807873 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807896 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.808866 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.821618 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.880399 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.880451 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908823 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908937 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908975 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.909012 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.909995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.910039 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.910148 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.931851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.932157 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.036621 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.127824 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.496244 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.542816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" event={"ID":"47e392d4-f48b-4079-afd3-a5d7fae209a8","Type":"ContainerStarted","Data":"e472827db9139a7f614c97e83facefb5442c631000200b49113eeb8a3e5f4be5"} Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.598370 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:53 crc kubenswrapper[4720]: W0121 14:45:53.604627 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa470a6_13e1_47a6_a036_d9a5bab976e6.slice/crio-66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d WatchSource:0}: Error finding container 66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d: Status 404 returned error can't find the container with id 66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d Jan 21 14:45:54 crc kubenswrapper[4720]: I0121 14:45:54.552273 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" event={"ID":"baa470a6-13e1-47a6-a036-d9a5bab976e6","Type":"ContainerStarted","Data":"66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d"} Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.635556 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.674029 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.684163 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.696688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.752899 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.753270 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.753299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854643 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.855586 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.855929 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.888746 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjd2\" (UniqueName: 
\"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.991562 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.012341 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.021569 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.026034 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.042899 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165258 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165412 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267121 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267594 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.268560 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.268851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.300438 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.370777 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.589250 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:56 crc kubenswrapper[4720]: W0121 14:45:56.602764 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b076fbb_9c67_4e19_a9e6_1acb75a52cb8.slice/crio-595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871 WatchSource:0}: Error finding container 595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871: Status 404 returned error can't find the container with id 595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871 Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.875006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.087555 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.088676 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095323 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095483 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095622 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095847 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095917 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qrxkj" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.105461 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.120888 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.235172 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.236337 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.248581 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.248949 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d7vj6" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249130 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249295 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249452 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.250345 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.257291 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.270317 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283375 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283433 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283545 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283594 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283684 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283769 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283848 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283883 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386505 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386599 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386724 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386788 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5db\" (UniqueName: 
\"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386852 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386977 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387247 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387297 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387314 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387381 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387401 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387426 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387472 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387876 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387968 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.389321 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.389606 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.391024 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408145 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408356 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.414589 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.425443 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.429957 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.443138 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489594 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489679 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489790 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489843 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489872 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489926 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489948 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491047 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491572 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491672 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.492366 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.493427 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.493802 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.494544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.495088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.500896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.505384 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.523530 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.570043 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.604847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerStarted","Data":"595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871"} Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.868160 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.373136 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.383313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.383451 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.392989 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bn29f" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393264 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393454 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393858 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.396885 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507950 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507976 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507995 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.508284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.508407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609622 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609781 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609796 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609878 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.610224 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.610650 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.611088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.612437 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.613013 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.618936 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.619995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.634995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.638936 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.710938 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.555650 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.556993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.562971 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.565266 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.566393 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hb7pt" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.563218 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.573139 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629110 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629174 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629518 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod 
\"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629557 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629571 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731237 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731293 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731345 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731494 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731527 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731573 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.732172 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.732317 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.735959 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.736385 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.736681 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.744125 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.747368 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.762800 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc 
kubenswrapper[4720]: I0121 14:45:59.763256 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.895646 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.936324 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.937628 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.940804 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pf64s" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.941401 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.942881 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.951324 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036137 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gwp\" (UniqueName: \"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036177 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036210 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036229 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036286 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137678 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gwp\" (UniqueName: 
\"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137717 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137767 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137831 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.138787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.138856 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.145143 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.149111 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.162279 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gwp\" (UniqueName: \"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.256939 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 21 14:46:01 crc kubenswrapper[4720]: W0121 14:46:01.272574 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf7a9dc_02fc_4976_afd7_2e172728b008.slice/crio-6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d WatchSource:0}: Error finding container 6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d: Status 404 returned error can't find the container with id 6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.659615 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerStarted","Data":"6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d"} Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.733060 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.742044 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.745471 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.747714 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tqvx9" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.870888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.973367 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.999215 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:02 crc kubenswrapper[4720]: I0121 14:46:02.060368 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.953550 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.957982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.966899 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.967061 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.967084 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9sf8m" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.969476 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.003794 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.005303 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.032361 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034465 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034506 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034574 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034626 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: 
\"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034652 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136526 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136597 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136629 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136682 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136748 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136787 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 
14:46:05.136836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136870 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136902 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137006 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137035 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137317 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137340 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137956 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.140899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.155286 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod 
\"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.158046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.158216 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240586 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240617 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240678 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240727 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240782 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243274 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243350 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243378 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243708 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.245751 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.263395 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.305797 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.328669 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.450250 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.451980 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458004 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458177 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-x8xgt" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458357 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458530 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458764 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.074800 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.094882 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.094945 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095021 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095106 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095134 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095211 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095287 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196766 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196937 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197279 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197316 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197356 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: 
I0121 14:46:07.197391 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197862 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.198198 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.198970 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.202993 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.203807 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.204839 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.219613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.220425 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.366872 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:09 crc kubenswrapper[4720]: I0121 14:46:09.195708 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.123453 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.125228 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.129313 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8q9gq" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.130623 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.130930 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.132068 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.138699 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267493 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267556 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267592 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267673 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267725 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368694 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368755 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368805 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368833 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368894 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368923 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc 
kubenswrapper[4720]: I0121 14:46:10.368955 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.369298 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370552 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.375075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.375280 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.377000 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.390925 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.392006 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0" Jan 
21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.460227 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:22 crc kubenswrapper[4720]: I0121 14:46:22.880342 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:46:22 crc kubenswrapper[4720]: I0121 14:46:22.881520 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.176352 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.177031 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfjd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-l76kh_openstack(7b076fbb-9c67-4e19-a9e6-1acb75a52cb8): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.178372 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" Jan 21 14:46:29 crc kubenswrapper[4720]: I0121 14:46:29.209736 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"348934cdbf75477f1ab960f3f1053dff6dbf9d2daa8c4387234ea6851e521a6d"} Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.212806 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.235180 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.235543 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg5kp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-56stq_openstack(baa470a6-13e1-47a6-a036-d9a5bab976e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.237275 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" podUID="baa470a6-13e1-47a6-a036-d9a5bab976e6" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.292726 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.292869 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t64nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-nbfkp_openstack(7bf7a9dc-02fc-4976-afd7-2e172728b008): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.294460 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.340851 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.341019 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfgg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-lvnwp_openstack(47e392d4-f48b-4079-afd3-a5d7fae209a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.342318 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" podUID="47e392d4-f48b-4079-afd3-a5d7fae209a8" Jan 21 14:46:29 crc kubenswrapper[4720]: I0121 14:46:29.993216 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.018574 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.030202 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: W0121 14:46:30.042211 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6a2220_24c4_4a0b_b72e_848dbac6a14b.slice/crio-2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e WatchSource:0}: Error finding container 2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e: Status 404 returned error can't find the container with id 2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.216264 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs" event={"ID":"95379233-3cd8-4dd3-bf0f-b8198f2258e1","Type":"ContainerStarted","Data":"14901142bbe0cf9c3f5f739a21ffe8acf3688a095725823833475bf0b7a521cc"} Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.217646 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e"} Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.218822 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"73c29d26-d7a2-40b5-81b8-ffda85c198d3","Type":"ContainerStarted","Data":"757461f9b0b16f1b1d3b7636456ed819d453bc36911002c98bf376775e84071e"} Jan 21 14:46:30 crc kubenswrapper[4720]: E0121 14:46:30.221191 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.325948 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.333951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.341207 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.596750 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.113588 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.233817 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"6a3244b72bc0d2e8692db27fe44256e6ab79a8541de8947798ad30865a4efc75"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.235799 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerStarted","Data":"b5de03c99a86e921243af3619119b73c952c5f3ccc688bb6fd4a69b6fda32dd9"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.236956 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"da6b6b430f12d2b56cf212530b8e484bf3b8d0da1c76e1f2c9cac8d57f6efdf2"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.323927 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 14:46:32 crc kubenswrapper[4720]: W0121 14:46:32.352864 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8cf4740_b779_4759_92d1_22ce3e5f1369.slice/crio-feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9 WatchSource:0}: Error finding container feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9: Status 404 returned error can't find the container with id feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9 Jan 21 14:46:32 crc kubenswrapper[4720]: W0121 14:46:32.357019 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b833ac6_f279_4dfb_84fb_22b531e6b7ef.slice/crio-059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc WatchSource:0}: Error finding container 059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc: Status 404 returned error can't find the container with id 059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.426437 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.438976 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540099 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540181 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540251 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.541115 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.541566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config" (OuterVolumeSpecName: "config") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.546823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp" (OuterVolumeSpecName: "kube-api-access-kg5kp") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "kube-api-access-kg5kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642063 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"47e392d4-f48b-4079-afd3-a5d7fae209a8\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642249 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"47e392d4-f48b-4079-afd3-a5d7fae209a8\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642532 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642547 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642558 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642960 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config" (OuterVolumeSpecName: "config") pod "47e392d4-f48b-4079-afd3-a5d7fae209a8" (UID: "47e392d4-f48b-4079-afd3-a5d7fae209a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.658618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5" (OuterVolumeSpecName: "kube-api-access-pfgg5") pod "47e392d4-f48b-4079-afd3-a5d7fae209a8" (UID: "47e392d4-f48b-4079-afd3-a5d7fae209a8"). InnerVolumeSpecName "kube-api-access-pfgg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.744477 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.744510 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.280305 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"354eb4da79f832f710738ebccd702c93cde8a2bb019f171edf279baa7729d9ce"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.284314 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.287050 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.288234 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" event={"ID":"47e392d4-f48b-4079-afd3-a5d7fae209a8","Type":"ContainerDied","Data":"e472827db9139a7f614c97e83facefb5442c631000200b49113eeb8a3e5f4be5"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.288291 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.290147 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" event={"ID":"baa470a6-13e1-47a6-a036-d9a5bab976e6","Type":"ContainerDied","Data":"66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.290225 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.340066 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.344449 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.371791 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.378092 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:46:34 crc kubenswrapper[4720]: I0121 14:46:34.688478 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e392d4-f48b-4079-afd3-a5d7fae209a8" path="/var/lib/kubelet/pods/47e392d4-f48b-4079-afd3-a5d7fae209a8/volumes" Jan 21 14:46:34 crc kubenswrapper[4720]: I0121 14:46:34.689303 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa470a6-13e1-47a6-a036-d9a5bab976e6" path="/var/lib/kubelet/pods/baa470a6-13e1-47a6-a036-d9a5bab976e6/volumes" Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.330274 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"73c29d26-d7a2-40b5-81b8-ffda85c198d3","Type":"ContainerStarted","Data":"9d9afaeb9b65beb101a57bd12820454808396b6e2e005a99a7bcaadb7bb3e1ee"} Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.330830 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.358042 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=31.466588956 podStartE2EDuration="39.358021989s" podCreationTimestamp="2026-01-21 14:45:59 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.035228824 +0000 UTC m=+1027.943968756" lastFinishedPulling="2026-01-21 14:46:37.926661857 +0000 UTC m=+1035.835401789" observedRunningTime="2026-01-21 14:46:38.353461605 +0000 UTC m=+1036.262201527" watchObservedRunningTime="2026-01-21 14:46:38.358021989 +0000 UTC m=+1036.266761931" Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.341877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.345540 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.347643 4720 generic.go:334] "Generic (PLEG): container finished" podID="04da7387-73aa-43e0-b547-7ce56e71d865" containerID="a5022c1f1b525d33e238d4482c9892406af087321aad8bd0ab2a7ea73cd4e288" exitCode=0 Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.347713 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerDied","Data":"a5022c1f1b525d33e238d4482c9892406af087321aad8bd0ab2a7ea73cd4e288"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.350545 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"0b1e13a703fb2f04a9edb1e5f91e4500fd7bd28dfa40a775132461a8b5680c63"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.352825 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.355067 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.357216 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs" event={"ID":"95379233-3cd8-4dd3-bf0f-b8198f2258e1","Type":"ContainerStarted","Data":"4fbf8c1ee36ba5d1aef5429cac91c5f160551d648190b4fdd659ba7ceb48ae56"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.357244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.391146 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpvzs" podStartSLOduration=27.41627989 podStartE2EDuration="35.391125688s" podCreationTimestamp="2026-01-21 14:46:04 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.041396132 +0000 UTC m=+1027.950136064" lastFinishedPulling="2026-01-21 14:46:38.01624193 +0000 UTC m=+1035.924981862" observedRunningTime="2026-01-21 14:46:39.383287745 +0000 UTC m=+1037.292027697" watchObservedRunningTime="2026-01-21 14:46:39.391125688 +0000 UTC m=+1037.299865620" Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.365612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"128dc883c1cb54d300554c0a1f9c402d25a7b42a58cf0cd66f72035cd7595489"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.368607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"c83330103379edd07e7e941627350e72fc565a47bf701190f1364a9ace4bad2d"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.370926 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerStarted","Data":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.397337 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=29.703317961 podStartE2EDuration="39.397316024s" podCreationTimestamp="2026-01-21 14:46:01 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.322866577 +0000 UTC m=+1028.231606509" lastFinishedPulling="2026-01-21 14:46:40.01686464 +0000 UTC m=+1037.925604572" observedRunningTime="2026-01-21 14:46:40.386954242 +0000 UTC m=+1038.295694204" watchObservedRunningTime="2026-01-21 14:46:40.397316024 +0000 UTC m=+1038.306055956" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.381818 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"39abff7ca05557b976eb4f12ae00164a0fe932683d17f8b0b89f7c43d6852f4d"} Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.381941 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.382250 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.382345 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.405085 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2v7f2" podStartSLOduration=31.734498826 podStartE2EDuration="37.405069703s" podCreationTimestamp="2026-01-21 14:46:04 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.353453435 +0000 UTC m=+1030.262193367" lastFinishedPulling="2026-01-21 14:46:38.024024312 +0000 UTC m=+1035.932764244" observedRunningTime="2026-01-21 14:46:41.402828781 +0000 UTC m=+1039.311568733" watchObservedRunningTime="2026-01-21 14:46:41.405069703 +0000 UTC m=+1039.313809635" Jan 21 14:46:44 crc kubenswrapper[4720]: I0121 14:46:44.407283 4720 generic.go:334] "Generic (PLEG): container finished" podID="8a6a2220-24c4-4a0b-b72e-848dbac6a14b" containerID="6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b" exitCode=0 Jan 21 14:46:44 crc kubenswrapper[4720]: I0121 14:46:44.407402 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerDied","Data":"6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b"} Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.260970 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.429511 4720 generic.go:334] "Generic (PLEG): container finished" podID="ab11441b-6bc4-4883-8a1e-866b31b425e9" containerID="35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1" exitCode=0 Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.430184 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerDied","Data":"35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.439021 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"ef22f61d71a363f2789f831afd2fde0d4f527560248ef9949ba2ef8cbf2285f9"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.442114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"f6a741dcb96fee8bc1a243c1147923a6ae85456e0394f00facd1588d9dab71be"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.444502 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerID="0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a" exitCode=0 Jan 21 14:46:46 crc 
kubenswrapper[4720]: I0121 14:46:46.444536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.448520 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"72be484776fdc013ccba2e85aaac2161a9ffdf86ec968878165385a82bab219e"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.450617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"dd2b0be046d44b565e237f62a26a48c9187b1871918bf42941523918804b0975"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.452566 4720 generic.go:334] "Generic (PLEG): container finished" podID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerID="483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f" exitCode=0 Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.452624 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.461084 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.476533 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.446174559 podStartE2EDuration="41.476513535s" podCreationTimestamp="2026-01-21 14:46:05 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.361052771 +0000 UTC m=+1030.269792713" lastFinishedPulling="2026-01-21 14:46:45.391391757 +0000 UTC m=+1043.300131689" observedRunningTime="2026-01-21 14:46:46.472365472 +0000 UTC m=+1044.381105414" watchObservedRunningTime="2026-01-21 14:46:46.476513535 +0000 UTC m=+1044.385253477" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.518347 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.519808 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=41.820754639 podStartE2EDuration="49.519791585s" podCreationTimestamp="2026-01-21 14:45:57 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.325785827 +0000 UTC m=+1028.234525759" lastFinishedPulling="2026-01-21 14:46:38.024822753 +0000 UTC m=+1035.933562705" observedRunningTime="2026-01-21 14:46:46.518097168 +0000 UTC m=+1044.426837110" watchObservedRunningTime="2026-01-21 14:46:46.519791585 +0000 UTC m=+1044.428531517" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.579991 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=40.701506916 podStartE2EDuration="48.579968826s" podCreationTimestamp="2026-01-21 14:45:58 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.046186652 +0000 UTC m=+1027.954926584" lastFinishedPulling="2026-01-21 14:46:37.924648562 +0000 UTC m=+1035.833388494" observedRunningTime="2026-01-21 
Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.600732 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.622010803 podStartE2EDuration="37.600713411s" podCreationTimestamp="2026-01-21 14:46:09 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.360975559 +0000 UTC m=+1030.269715491" lastFinishedPulling="2026-01-21 14:46:45.339678167 +0000 UTC m=+1043.248418099" observedRunningTime="2026-01-21 14:46:46.594751508 +0000 UTC m=+1044.503491450" watchObservedRunningTime="2026-01-21 14:46:46.600713411 +0000 UTC m=+1044.509453353"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.367481 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.462960 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerStarted","Data":"9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b"}
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.463171 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-l76kh"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.465318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerStarted","Data":"a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285"}
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.465833 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.491605 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podStartSLOduration=3.692035401 podStartE2EDuration="52.491582903s" podCreationTimestamp="2026-01-21 14:45:55 +0000 UTC" firstStartedPulling="2026-01-21 14:45:56.606488604 +0000 UTC m=+994.515228536" lastFinishedPulling="2026-01-21 14:46:45.406036106 +0000 UTC m=+1043.314776038" observedRunningTime="2026-01-21 14:46:47.483939734 +0000 UTC m=+1045.392679686" watchObservedRunningTime="2026-01-21 14:46:47.491582903 +0000 UTC m=+1045.400322855"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.501081 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podStartSLOduration=7.37241547 podStartE2EDuration="51.501062731s" podCreationTimestamp="2026-01-21 14:45:56 +0000 UTC" firstStartedPulling="2026-01-21 14:46:01.277242991 +0000 UTC m=+999.185982923" lastFinishedPulling="2026-01-21 14:46:45.405890252 +0000 UTC m=+1043.314630184" observedRunningTime="2026-01-21 14:46:47.498673496 +0000 UTC m=+1045.407413428" watchObservedRunningTime="2026-01-21 14:46:47.501062731 +0000 UTC m=+1045.409802663"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.515858 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.797000 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"]
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.826274 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"]
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.827702 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.830021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.840499 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"]
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.922557 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-h55pf"]
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.926122 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.931968 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.945540 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h55pf"]
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946885 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946983 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.947097 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.048957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049167 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049288 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049325 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049365 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049390 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049420 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.050409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.050453 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.051039 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.068296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.107175 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"]
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.137195 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"]
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.138335 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.142945 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.147219 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155816 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155834 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155895 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155917 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155937 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156164 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156640 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc 
kubenswrapper[4720]: I0121 14:46:48.162123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.171068 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.204337 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.212605 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.256040 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257340 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257479 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257580 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257681 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc 
kubenswrapper[4720]: I0121 14:46:48.361948 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.362160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.362244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.367835 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.365892 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.373999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.365109 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.374186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.375058 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.389018 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.475079 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" containerID="cri-o://a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" gracePeriod=10 Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.475923 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.586874 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.712540 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.712582 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.368136 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.412739 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.484469 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" containerID="cri-o://9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" gracePeriod=10 Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.526301 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.682287 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.684040 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.689811 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.690076 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.690203 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mzpmv" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.693010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.701011 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.808729 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809172 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809276 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809618 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809668 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809852 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: 
I0121 14:46:49.896478 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.896543 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911714 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911843 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911934 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911966 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914424 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914478 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.928485 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.931469 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.933032 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.934138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.009820 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.155461 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.253502 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.261951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h55pf"] Jan 21 14:46:50 crc kubenswrapper[4720]: W0121 14:46:50.269478 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fc0e40b_c337_42d2_87a3_2eedfa2f1a65.slice/crio-6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd WatchSource:0}: Error finding container 6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd: Status 404 returned error can't find the container with id 6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.494854 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.500239 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerStarted","Data":"796d2687528fd87d25dd4fc1a5f89808d76b284cc0b9360ef63068e7663548e8"} Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.501485 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h55pf" event={"ID":"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65","Type":"ContainerStarted","Data":"6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd"} Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.502983 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerStarted","Data":"5c2c6c7764adf382494a9f5fec6ff894d74ac4a2178d6e2c114761cba32aec98"} Jan 21 14:46:50 crc kubenswrapper[4720]: W0121 14:46:50.504212 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod262f8354_3f7b_483f_940d_8b0f394e344a.slice/crio-a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745 WatchSource:0}: Error finding container a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745: Status 404 returned error can't find the container with id a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.015014 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.374739 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.580403 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bf7a9dc-02fc-4976-afd7-2e172728b008" 
containerID="a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.580516 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.598628 4720 generic.go:334] "Generic (PLEG): container finished" podID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerID="ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.598720 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.633900 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h55pf" event={"ID":"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65","Type":"ContainerStarted","Data":"fa9f95f15aae289dbfc14f2cb29da9f352c35627be5ad7715617cf9c24a323e5"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.649474 4720 generic.go:334] "Generic (PLEG): container finished" podID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerID="9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.649532 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.650799 4720 generic.go:334] "Generic (PLEG): container finished" podID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.650831 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.674249 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.732982 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-h55pf" podStartSLOduration=4.732965662 podStartE2EDuration="4.732965662s" podCreationTimestamp="2026-01-21 14:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:51.690401701 +0000 UTC m=+1049.599141633" watchObservedRunningTime="2026-01-21 14:46:51.732965662 +0000 UTC m=+1049.641705594" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.074207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.284627 4720 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.307299 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382704 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382803 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382863 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382883 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.383473 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.383576 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.397352 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2" (OuterVolumeSpecName: "kube-api-access-gfjd2") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "kube-api-access-gfjd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.397637 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn" (OuterVolumeSpecName: "kube-api-access-t64nn") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "kube-api-access-t64nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.438615 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.443510 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config" (OuterVolumeSpecName: "config") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.461313 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config" (OuterVolumeSpecName: "config") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.470204 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486830 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486870 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486880 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486888 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486898 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486905 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.685098 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.690480 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696686 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696722 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerStarted","Data":"2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697725 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697791 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697811 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerStarted","Data":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697920 4720 scope.go:117] "RemoveContainer" containerID="a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.730397 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" podStartSLOduration=5.730380808 podStartE2EDuration="5.730380808s" podCreationTimestamp="2026-01-21 14:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:52.729451043 +0000 UTC m=+1050.638190995" watchObservedRunningTime="2026-01-21 14:46:52.730380808 +0000 UTC m=+1050.639120750" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.752400 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.754564 4720 scope.go:117] "RemoveContainer" containerID="0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.778345 4720 scope.go:117] "RemoveContainer" containerID="9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.802909 4720 scope.go:117] "RemoveContainer" containerID="483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.804356 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.806106 4720 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" podStartSLOduration=4.806091402 podStartE2EDuration="4.806091402s" podCreationTimestamp="2026-01-21 14:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:52.765114485 +0000 UTC m=+1050.673854427" watchObservedRunningTime="2026-01-21 14:46:52.806091402 +0000 UTC m=+1050.714831324" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.818092 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.823863 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881127 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881174 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881213 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881771 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881821 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" gracePeriod=600 Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707384 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" exitCode=0 Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707445 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707737 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} Jan 21 
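
The sequence above is the kubelet's liveness-probe kill path: an HTTP GET to the container's health endpoint fails with connection refused, the probe is reported unhealthy, and the runtime manager kills the container with the pod's termination grace period (600s here) so it can be restarted. A minimal Go sketch of those semantics, assuming an illustrative period and failureThreshold; this is not kubelet source:

    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeOnce performs one HTTP liveness check; any transport error or
    // 4xx/5xx status counts as a failure.
    func probeOnce(url string) error {
    	c := &http.Client{Timeout: time.Second}
    	resp, err := c.Get(url)
    	if err != nil {
    		return err // e.g. "connect: connection refused", as in the log
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 400 {
    		return fmt.Errorf("status %d", resp.StatusCode)
    	}
    	return nil
    }

    func main() {
    	const failureThreshold = 3 // assumed; pods configure their own
    	failures := 0
    	for range time.Tick(10 * time.Second) {
    		if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
    			failures++
    			fmt.Println("Probe failed:", err)
    			if failures >= failureThreshold {
    				fmt.Println("failed liveness probe, will be restarted")
    				return // the kubelet would now kill with the grace period
    			}
    		} else {
    			failures = 0 // any success resets the counter
    		}
    	}
    }
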
Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714215 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"815adb0d9349b00edb50d5383a97b04d2ea58b47183d7b8c549b2f06e6736cf2"}
Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714271 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"825c39cc72d95021f48701e28220e0cc1b60e6d64d4e16d91eca4ddce4e14ddf"}
Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714297 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.754395 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.505535751 podStartE2EDuration="4.75437442s" podCreationTimestamp="2026-01-21 14:46:49 +0000 UTC" firstStartedPulling="2026-01-21 14:46:50.506389007 +0000 UTC m=+1048.415128939" lastFinishedPulling="2026-01-21 14:46:52.755227676 +0000 UTC m=+1050.663967608" observedRunningTime="2026-01-21 14:46:53.746222317 +0000 UTC m=+1051.654962269" watchObservedRunningTime="2026-01-21 14:46:53.75437442 +0000 UTC m=+1051.663114362"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.533390 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.614403 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.696914 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" path="/var/lib/kubelet/pods/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8/volumes"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.698179 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" path="/var/lib/kubelet/pods/7bf7a9dc-02fc-4976-afd7-2e172728b008/volumes"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.800586 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.870522 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.532463 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"]
Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533058 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533139 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533208 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="init"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533272 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="init"
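
The pod_startup_latency_tracker entry for ovn-northd-0 above records two durations: podStartE2EDuration (watch-observed running time minus pod creation time) and podStartSLOduration, which additionally subtracts the image-pull window so pull time does not count against the startup SLO. A small Go check reproducing the logged numbers; the timestamps are copied from the entry and the formula is inferred from the logged fields:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	parse := func(s string) time.Time {
    		t, err := time.Parse(time.RFC3339Nano, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2026-01-21T14:46:49Z")
    	firstPull := parse("2026-01-21T14:46:50.506389007Z")
    	lastPull := parse("2026-01-21T14:46:52.755227676Z")
    	running := parse("2026-01-21T14:46:53.75437442Z") // watchObservedRunningTime

    	e2e := running.Sub(created)
    	slo := e2e - lastPull.Sub(firstPull) // exclude the image-pull window
    	fmt.Println("podStartE2EDuration =", e2e) // 4.75437442s, as logged
    	fmt.Println("podStartSLOduration =", slo) // 2.505535751s, as logged
    }
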
Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533337 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="init"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533387 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="init"
Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533458 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533514 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533721 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533797 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.534371 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.536715 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.541107 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ckgkh"]
Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.550818 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ckgkh"
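
The paired cpu_manager/state_mem/memory_manager lines are housekeeping: when pods are deleted, any CPU-set or memory pinning state recorded for their containers is dropped before new containers are admitted. A sketch of that sweep under assumed types; the real managers also key state by pod UID and container name, but the code below is illustrative, not kubelet's API:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops every assignment whose pod is no longer active,
    // mirroring the "RemoveStaleState" / "Deleted CPUSet assignment" pairs.
    func removeStaleState(assignments map[key]string, activePods map[string]bool) {
    	for k := range assignments {
    		if !activePods[k.podUID] {
    			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
    				k.podUID, k.container)
    			delete(assignments, k) // "Deleted CPUSet assignment"
    		}
    	}
    }

    func main() {
    	a := map[key]string{
    		{"7bf7a9dc-02fc-4976-afd7-2e172728b008", "dnsmasq-dns"}: "0-3", // CPU set is hypothetical
    	}
    	removeStaleState(a, map[string]bool{}) // the pod was already deleted
    }
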
Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.569509 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.578401 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.637157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.637429 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739081 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739171 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739206 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739282 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.740084 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.759301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j27h\" (UniqueName: 
\"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.841215 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.841446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.843120 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.865451 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.865717 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.874591 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.327395 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.469940 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:46:56 crc kubenswrapper[4720]: W0121 14:46:56.479948 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4bb55ed_9214_4f25_8740_ac50421baa4b.slice/crio-50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b WatchSource:0}: Error finding container 50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b: Status 404 returned error can't find the container with id 50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.743804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerStarted","Data":"7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.744053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerStarted","Data":"50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.745770 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerStarted","Data":"3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.745829 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerStarted","Data":"bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.774511 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-ckgkh" podStartSLOduration=1.774278123 podStartE2EDuration="1.774278123s" podCreationTimestamp="2026-01-21 14:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:56.768757173 +0000 UTC m=+1054.677497175" watchObservedRunningTime="2026-01-21 14:46:56.774278123 +0000 UTC m=+1054.683018095" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.268049 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.270185 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.273028 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.289392 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.364831 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.364908 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.466197 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.466595 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.467133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.484525 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.586614 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.757249 4720 generic.go:334] "Generic (PLEG): container finished" podID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerID="3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0" exitCode=0 Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.757963 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerDied","Data":"3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0"} Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.766435 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerID="7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044" exitCode=0 Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.766485 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerDied","Data":"7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044"} Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.047868 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.149271 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.587880 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.696229 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778261 4720 generic.go:334] "Generic (PLEG): container finished" podID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerID="cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9" exitCode=0 Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778732 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerDied","Data":"cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9"} Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778761 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerStarted","Data":"15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4"} Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778959 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns" containerID="cri-o://776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" gracePeriod=10 Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.230895 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.231411 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.362447 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413093 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"0d0385ad-a123-4c46-a96f-652dee1f89cd\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"0d0385ad-a123-4c46-a96f-652dee1f89cd\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"b4bb55ed-9214-4f25-8740-ac50421baa4b\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"b4bb55ed-9214-4f25-8740-ac50421baa4b\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.414083 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4bb55ed-9214-4f25-8740-ac50421baa4b" (UID: "b4bb55ed-9214-4f25-8740-ac50421baa4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.414426 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d0385ad-a123-4c46-a96f-652dee1f89cd" (UID: "0d0385ad-a123-4c46-a96f-652dee1f89cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.429696 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268" (OuterVolumeSpecName: "kube-api-access-q8268") pod "0d0385ad-a123-4c46-a96f-652dee1f89cd" (UID: "0d0385ad-a123-4c46-a96f-652dee1f89cd"). InnerVolumeSpecName "kube-api-access-q8268". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.443985 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h" (OuterVolumeSpecName: "kube-api-access-2j27h") pod "b4bb55ed-9214-4f25-8740-ac50421baa4b" (UID: "b4bb55ed-9214-4f25-8740-ac50421baa4b"). InnerVolumeSpecName "kube-api-access-2j27h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514767 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514840 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514875 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515422 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515437 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515451 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515463 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.527453 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg" (OuterVolumeSpecName: "kube-api-access-99xqg") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "kube-api-access-99xqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.553113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.557262 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.567582 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config" (OuterVolumeSpecName: "config") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617430 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617473 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617485 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617497 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788596 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788600 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerDied","Data":"bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63"} Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788672 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerDied","Data":"50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b"} Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790762 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790808 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793721 4720 generic.go:334] "Generic (PLEG): container finished" podID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" exitCode=0 Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793898 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793903 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"} Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.794114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"5c2c6c7764adf382494a9f5fec6ff894d74ac4a2178d6e2c114761cba32aec98"} Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.794137 4720 scope.go:117] "RemoveContainer" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.830724 4720 scope.go:117] "RemoveContainer" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.845500 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.845941 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.845959 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update" Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.845998 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846006 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns" Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.846022 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="init" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846030 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="init" Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.846048 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846056 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846287 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846299 4720 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846314 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846840 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.852741 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.862137 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.862822 4720 scope.go:117] "RemoveContainer" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.865404 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": container with ID starting with 776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196 not found: ID does not exist" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.865523 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"} err="failed to get container status \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": rpc error: code = NotFound desc = could not find container \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": container with ID starting with 776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196 not found: ID does not exist" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.865613 4720 scope.go:117] "RemoveContainer" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906" Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.865995 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": container with ID starting with 6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906 not found: ID does not exist" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.866017 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"} err="failed to get container status \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": rpc error: code = NotFound desc = could not find container \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": container with ID starting with 6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906 not found: ID does not exist" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.872822 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.956647 
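
The NotFound pairs above are expected noise: the containers had already been removed by the earlier RemoveContainer pass, so the retry's ContainerStatus lookup fails and the deletor logs the error and moves on. Cleanup has to be idempotent. A sketch of that tolerate-NotFound pattern; the sentinel error and helper are illustrative, not the CRI client:

    package main

    import (
    	"errors"
    	"fmt"
    )

    var errNotFound = errors.New("rpc error: code = NotFound desc = ID does not exist")

    // containerStatus stands in for the CRI ContainerStatus call; here the
    // container is already gone, as in the log.
    func containerStatus(id string) error { return errNotFound }

    // removeContainer treats NotFound as success so repeated cleanup of the
    // same container never fails the sync loop.
    func removeContainer(id string) {
    	if err := containerStatus(id); err != nil {
    		if errors.Is(err, errNotFound) {
    			fmt.Printf("DeleteContainer returned error for %s (already gone): %v\n", id[:12], err)
    			return // nothing left to do
    		}
    		fmt.Println("ContainerStatus from runtime service failed:", err)
    		return
    	}
    	fmt.Println("removing", id[:12])
    }

    func main() {
    	removeContainer("776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196")
    }
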
4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.957568 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.966106 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.968838 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.023919 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.024153 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.099389 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zp68q" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.125887 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126866 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.127179 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: 
\"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142026 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:47:00 crc kubenswrapper[4720]: E0121 14:47:00.142387 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142404 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142784 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.143284 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.149985 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.157992 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.173225 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.228912 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"90833a99-00de-45a6-a7c1-4357c6b5f36d\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229165 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"90833a99-00de-45a6-a7c1-4357c6b5f36d\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229421 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229521 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229560 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.231456 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.232542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90833a99-00de-45a6-a7c1-4357c6b5f36d" (UID: "90833a99-00de-45a6-a7c1-4357c6b5f36d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.239429 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5" (OuterVolumeSpecName: "kube-api-access-7t5s5") pod "90833a99-00de-45a6-a7c1-4357c6b5f36d" (UID: "90833a99-00de-45a6-a7c1-4357c6b5f36d"). InnerVolumeSpecName "kube-api-access-7t5s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.262826 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.290928 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332474 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332541 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332553 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.334335 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.353710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.353759 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.354758 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.359312 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.366960 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.434456 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.434642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536013 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536907 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.563151 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.611102 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dtj5w"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.612167 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.616260 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dfmqw" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.618168 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.627935 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.646790 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dtj5w"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.682274 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.697232 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" path="/var/lib/kubelet/pods/92a2976e-a745-4fc4-ae87-355cf6defe5e/volumes" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.698297 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:47:00 crc kubenswrapper[4720]: W0121 14:47:00.700594 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290dffa3_ed33_4571_aeb1_092aae1d8105.slice/crio-1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3 WatchSource:0}: Error finding container 1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3: Status 404 returned error can't find the container with id 1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3 Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748290 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748735 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748770 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.811350 4720 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerStarted","Data":"1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3"} Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814102 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerDied","Data":"15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4"} Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814120 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zp68q" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851105 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.856634 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.858467 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.861920 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.859705 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.868923 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.936011 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.089995 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:47:01 crc kubenswrapper[4720]: W0121 14:47:01.090959 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f0b95b_6621_43fe_93c2_d4e7704f1f61.slice/crio-3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06 WatchSource:0}: Error finding container 3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06: Status 404 returned error can't find the container with id 3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06 Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.190388 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.555813 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dtj5w"] Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.824894 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerStarted","Data":"dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.824943 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerStarted","Data":"2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.826183 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerStarted","Data":"cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.827934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerStarted","Data":"41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.827957 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerStarted","Data":"49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.830044 4720 generic.go:334] "Generic (PLEG): container finished" 
podID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerID="93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7" exitCode=0 Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.830086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerDied","Data":"93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.831133 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerStarted","Data":"3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06"} Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.847880 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-06a3-account-create-update-dbk66" podStartSLOduration=2.847851354 podStartE2EDuration="2.847851354s" podCreationTimestamp="2026-01-21 14:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:01.846674391 +0000 UTC m=+1059.755414333" watchObservedRunningTime="2026-01-21 14:47:01.847851354 +0000 UTC m=+1059.756591296" Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.839378 4720 generic.go:334] "Generic (PLEG): container finished" podID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerID="abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947" exitCode=0 Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.839478 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerDied","Data":"abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947"} Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.895907 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-318a-account-create-update-lkf6p" podStartSLOduration=2.895888711 podStartE2EDuration="2.895888711s" podCreationTimestamp="2026-01-21 14:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:02.892450307 +0000 UTC m=+1060.801190239" watchObservedRunningTime="2026-01-21 14:47:02.895888711 +0000 UTC m=+1060.804628653" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.152935 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.291039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"290dffa3-ed33-4571-aeb1-092aae1d8105\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.291115 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"290dffa3-ed33-4571-aeb1-092aae1d8105\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.292064 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "290dffa3-ed33-4571-aeb1-092aae1d8105" (UID: "290dffa3-ed33-4571-aeb1-092aae1d8105"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.297363 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh" (OuterVolumeSpecName: "kube-api-access-8bnwh") pod "290dffa3-ed33-4571-aeb1-092aae1d8105" (UID: "290dffa3-ed33-4571-aeb1-092aae1d8105"). InnerVolumeSpecName "kube-api-access-8bnwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.393317 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.393339 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.556401 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.565528 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.856290 4720 generic.go:334] "Generic (PLEG): container finished" podID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerID="41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9" exitCode=0 Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.856482 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerDied","Data":"41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9"} Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861213 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerDied","Data":"1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3"} Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861256 4720 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861236 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g" Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.862672 4720 generic.go:334] "Generic (PLEG): container finished" podID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerID="dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181" exitCode=0 Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.862807 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerDied","Data":"dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181"} Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.176396 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309019 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309102 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309584 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49f0b95b-6621-43fe-93c2-d4e7704f1f61" (UID: "49f0b95b-6621-43fe-93c2-d4e7704f1f61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309693 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.316209 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs" (OuterVolumeSpecName: "kube-api-access-x55hs") pod "49f0b95b-6621-43fe-93c2-d4e7704f1f61" (UID: "49f0b95b-6621-43fe-93c2-d4e7704f1f61"). InnerVolumeSpecName "kube-api-access-x55hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.412020 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.687499 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" path="/var/lib/kubelet/pods/90833a99-00de-45a6-a7c1-4357c6b5f36d/volumes" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870369 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq" Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerDied","Data":"3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06"} Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870995 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.074323 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.276971 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.340227 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.430909 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.430983 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.431079 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.431096 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.432113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4fbe0fa-0158-480f-9f6d-2d589da3b91e" (UID: "a4fbe0fa-0158-480f-9f6d-2d589da3b91e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.432174 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8161ded5-d8ab-48b7-9c1a-16a7155641d1" (UID: "8161ded5-d8ab-48b7-9c1a-16a7155641d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.436875 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d" (OuterVolumeSpecName: "kube-api-access-llz2d") pod "8161ded5-d8ab-48b7-9c1a-16a7155641d1" (UID: "8161ded5-d8ab-48b7-9c1a-16a7155641d1"). InnerVolumeSpecName "kube-api-access-llz2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.447128 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn" (OuterVolumeSpecName: "kube-api-access-fggzn") pod "a4fbe0fa-0158-480f-9f6d-2d589da3b91e" (UID: "a4fbe0fa-0158-480f-9f6d-2d589da3b91e"). InnerVolumeSpecName "kube-api-access-fggzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532913 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532951 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532966 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532997 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247639 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerDied","Data":"49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732"} Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247715 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732" Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247782 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66" Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269266 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerDied","Data":"2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90"} Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269311 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p" Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269321 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.551574 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552272 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552285 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552296 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552303 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552330 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552338 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552352 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552359 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552498 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552516 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552533 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552542 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.553089 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.555706 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.568096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.683179 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.683335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.784273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.784500 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.788003 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.835208 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.870069 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.347328 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wpvzs" podUID="95379233-3cd8-4dd3-bf0f-b8198f2258e1" containerName="ovn-controller" probeResult="failure" output=< Jan 21 14:47:10 crc kubenswrapper[4720]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 14:47:10 crc kubenswrapper[4720]: > Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.396765 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.402218 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.726306 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.727299 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.729784 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.750308 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814493 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814541 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: 
\"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916503 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916592 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916627 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916697 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916712 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917365 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917389 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod 
\"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917441 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917454 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.918930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.942345 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.055034 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.309804 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerID="c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea" exitCode=0 Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.309854 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea"} Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.312098 4720 generic.go:334] "Generic (PLEG): container finished" podID="3a2eafda-c352-4311-94d5-a1aec1422699" containerID="c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed" exitCode=0 Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.312787 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed"} Jan 21 14:47:15 crc kubenswrapper[4720]: I0121 14:47:15.337636 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wpvzs" podUID="95379233-3cd8-4dd3-bf0f-b8198f2258e1" containerName="ovn-controller" probeResult="failure" output=< Jan 21 14:47:15 crc kubenswrapper[4720]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 14:47:15 crc kubenswrapper[4720]: > Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.375680 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.376429 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6m45w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-dtj5w_openstack(c40c650e-a05e-4cc0-88fa-d56eae92d29a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.378475 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-dtj5w" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.412782 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-dtj5w" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" Jan 21 14:47:18 crc kubenswrapper[4720]: I0121 14:47:18.733038 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:47:18 crc kubenswrapper[4720]: I0121 14:47:18.887716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:18 crc kubenswrapper[4720]: W0121 14:47:18.887814 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod049894b0_0575_4fc0_bbca_f75722e173af.slice/crio-388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c WatchSource:0}: Error finding container 388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c: Status 
404 returned error can't find the container with id 388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.411692 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.412101 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.413088 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerStarted","Data":"7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.413114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerStarted","Data":"388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.414602 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerStarted","Data":"bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.414634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerStarted","Data":"7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.417086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.417499 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.462684 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=76.829157744 podStartE2EDuration="1m24.462669615s" podCreationTimestamp="2026-01-21 14:45:55 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.32224961 +0000 UTC m=+1028.230989542" lastFinishedPulling="2026-01-21 14:46:37.955761481 +0000 UTC m=+1035.864501413" observedRunningTime="2026-01-21 14:47:19.436041519 +0000 UTC m=+1077.344781461" watchObservedRunningTime="2026-01-21 14:47:19.462669615 +0000 UTC m=+1077.371409557" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.465543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=74.599967347 podStartE2EDuration="1m23.465531223s" podCreationTimestamp="2026-01-21 14:45:56 +0000 UTC" firstStartedPulling="2026-01-21 14:46:28.577809154 +0000 UTC m=+1026.486549086" lastFinishedPulling="2026-01-21 14:46:37.44337303 +0000 UTC m=+1035.352112962" observedRunningTime="2026-01-21 14:47:19.457188516 +0000 UTC m=+1077.365928468" watchObservedRunningTime="2026-01-21 
14:47:19.465531223 +0000 UTC m=+1077.374271155" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.478549 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpvzs-config-5vhqv" podStartSLOduration=9.478532977 podStartE2EDuration="9.478532977s" podCreationTimestamp="2026-01-21 14:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:19.47423851 +0000 UTC m=+1077.382978452" watchObservedRunningTime="2026-01-21 14:47:19.478532977 +0000 UTC m=+1077.387272909" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.489491 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-89dxv" podStartSLOduration=11.489476596 podStartE2EDuration="11.489476596s" podCreationTimestamp="2026-01-21 14:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:19.487883283 +0000 UTC m=+1077.396623235" watchObservedRunningTime="2026-01-21 14:47:19.489476596 +0000 UTC m=+1077.398216528" Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.370004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wpvzs" Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.439258 4720 generic.go:334] "Generic (PLEG): container finished" podID="049894b0-0575-4fc0-bbca-f75722e173af" containerID="7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687" exitCode=0 Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.439352 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerDied","Data":"7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687"} Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.447940 4720 generic.go:334] "Generic (PLEG): container finished" podID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerID="bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a" exitCode=0 Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.449043 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerDied","Data":"bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a"} Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.817692 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.902963 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.938455 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.938583 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.939351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fc2d647-37b6-4437-98fc-1d95af05cfe0" (UID: "1fc2d647-37b6-4437-98fc-1d95af05cfe0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.953511 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5" (OuterVolumeSpecName: "kube-api-access-z5lc5") pod "1fc2d647-37b6-4437-98fc-1d95af05cfe0" (UID: "1fc2d647-37b6-4437-98fc-1d95af05cfe0"). InnerVolumeSpecName "kube-api-access-z5lc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040435 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040464 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040518 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run" (OuterVolumeSpecName: "var-run") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040997 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041036 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041040 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041095 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041349 4720 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041364 4720 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041375 4720 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041388 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041399 4720 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041410 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041456 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts" (OuterVolumeSpecName: "scripts") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.046230 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9" (OuterVolumeSpecName: "kube-api-access-gfzq9") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "kube-api-access-gfzq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.142768 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.142798 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462835 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerDied","Data":"388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c"} Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462881 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462880 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerDied","Data":"7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8"} Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464443 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464489 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:23 crc kubenswrapper[4720]: I0121 14:47:23.020741 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:23 crc kubenswrapper[4720]: I0121 14:47:23.029624 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:24 crc kubenswrapper[4720]: I0121 14:47:24.694840 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049894b0-0575-4fc0-bbca-f75722e173af" path="/var/lib/kubelet/pods/049894b0-0575-4fc0-bbca-f75722e173af/volumes" Jan 21 14:47:31 crc kubenswrapper[4720]: I0121 14:47:31.529762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerStarted","Data":"8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed"} Jan 21 14:47:31 crc kubenswrapper[4720]: I0121 14:47:31.550238 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dtj5w" podStartSLOduration=2.920606295 podStartE2EDuration="31.550220566s" podCreationTimestamp="2026-01-21 14:47:00 +0000 UTC" firstStartedPulling="2026-01-21 14:47:01.57154684 +0000 UTC m=+1059.480286772" lastFinishedPulling="2026-01-21 14:47:30.201161091 +0000 UTC m=+1088.109901043" observedRunningTime="2026-01-21 14:47:31.545414185 +0000 UTC m=+1089.454154127" watchObservedRunningTime="2026-01-21 14:47:31.550220566 +0000 UTC m=+1089.458960498" Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.447874 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.592231 4720 generic.go:334] "Generic (PLEG): container finished" podID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerID="8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed" exitCode=0 Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.592283 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerDied","Data":"8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed"} Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.871874 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.024714 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:38 crc kubenswrapper[4720]: E0121 14:47:38.025316 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025337 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: E0121 14:47:38.025356 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025362 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025538 4720 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025559 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.026146 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.040715 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.041809 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.051059 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.053778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.072704 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.098409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.099350 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.115679 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153462 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153526 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153550 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc 
kubenswrapper[4720]: I0121 14:47:38.153621 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.181113 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.182101 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.190412 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.198181 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254747 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254765 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") 
" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254906 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254954 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.255608 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.256139 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.256219 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.278733 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.283020 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.288397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " 
pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.300299 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.301245 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304947 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304997 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.314343 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355972 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355999 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356031 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356061 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356793 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: 
\"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.376183 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.377014 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.382951 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.392737 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.392835 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.393748 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.430125 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457118 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457142 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457173 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457211 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc 
kubenswrapper[4720]: I0121 14:47:38.463616 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.464140 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.489896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.502107 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.559500 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.559929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.560539 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.604120 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.614593 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.615620 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.617933 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.638133 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.656299 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.661970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.662039 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.763088 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.763287 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.764019 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.783526 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.783854 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.951174 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.055778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.061834 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.230612 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281222 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281273 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281380 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281422 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.282271 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:39 crc kubenswrapper[4720]: W0121 14:47:39.309123 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6545ddce_5b65_4702_9dee_2f2d9644123e.slice/crio-528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc WatchSource:0}: Error finding container 528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc: Status 404 returned error can't find the container with id 528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.317367 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.317986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w" (OuterVolumeSpecName: "kube-api-access-6m45w") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "kube-api-access-6m45w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.320721 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.370802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384006 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384031 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384042 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.488448 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data" (OuterVolumeSpecName: "config-data") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.490708 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.511243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.517079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.523736 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.619721 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerStarted","Data":"44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.627847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerStarted","Data":"a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647411 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerDied","Data":"cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647452 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647542 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.664206 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerStarted","Data":"528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.672784 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerStarted","Data":"658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.679928 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerStarted","Data":"80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.682076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerStarted","Data":"dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.683860 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerStarted","Data":"13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.683883 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerStarted","Data":"e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.699185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qjpx9" podStartSLOduration=2.699166943 podStartE2EDuration="2.699166943s" podCreationTimestamp="2026-01-21 14:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:39.698100624 +0000 UTC m=+1097.606840556" watchObservedRunningTime="2026-01-21 14:47:39.699166943 +0000 UTC m=+1097.607906875" Jan 21 14:47:39 crc kubenswrapper[4720]: E0121 14:47:39.721941 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40c650e_a05e_4cc0_88fa_d56eae92d29a.slice/crio-cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40c650e_a05e_4cc0_88fa_d56eae92d29a.slice\": RecentStats: unable to find data in memory cache]" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.207761 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:40 crc kubenswrapper[4720]: E0121 14:47:40.208469 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.208533 4720 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.208765 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.216978 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.244312 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311520 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311627 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311680 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311731 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311746 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413151 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413489 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413547 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413630 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.414582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.415639 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.418231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.419280 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.420354 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.470056 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.543971 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.695195 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerStarted","Data":"0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.697337 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerStarted","Data":"307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.708261 4720 generic.go:334] "Generic (PLEG): container finished" podID="077e6634-d42f-4765-ab65-9e24cf21a047" containerID="13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f" exitCode=0 Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.708329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerDied","Data":"13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.710745 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerStarted","Data":"5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.714171 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerStarted","Data":"b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.732252 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerStarted","Data":"f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.763823 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b1e4-account-create-update-qtmr9" podStartSLOduration=3.763805322 podStartE2EDuration="3.763805322s" podCreationTimestamp="2026-01-21 14:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.729447896 +0000 UTC m=+1098.638187828" watchObservedRunningTime="2026-01-21 14:47:40.763805322 +0000 UTC m=+1098.672545254" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.795716 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-pmrgf" podStartSLOduration=2.795700792 podStartE2EDuration="2.795700792s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.788454084 +0000 UTC m=+1098.697194016" watchObservedRunningTime="2026-01-21 14:47:40.795700792 +0000 UTC m=+1098.704440724" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.812143 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-9105-account-create-update-h4nvp" podStartSLOduration=2.8121297800000002 podStartE2EDuration="2.81212978s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.809050496 +0000 UTC m=+1098.717790428" watchObservedRunningTime="2026-01-21 14:47:40.81212978 +0000 UTC m=+1098.720869712" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.839626 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-md2wm" podStartSLOduration=2.8396042 podStartE2EDuration="2.8396042s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.82936447 +0000 UTC m=+1098.738104402" watchObservedRunningTime="2026-01-21 14:47:40.8396042 +0000 UTC m=+1098.748344122" Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.182691 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-43f1-account-create-update-bsqmb" podStartSLOduration=3.182641113 podStartE2EDuration="3.182641113s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.851881624 +0000 UTC m=+1098.760621556" watchObservedRunningTime="2026-01-21 14:47:41.182641113 +0000 UTC m=+1099.091381045" Jan 21 14:47:41 crc kubenswrapper[4720]: W0121 14:47:41.194844 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c8b46b_d758_4538_a345_21ccc71aabe4.slice/crio-e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586 WatchSource:0}: Error finding container e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586: Status 404 returned error can't find the container with id e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.211829 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.745353 4720 generic.go:334] "Generic (PLEG): container finished" podID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerID="b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.745434 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerDied","Data":"b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.747127 4720 generic.go:334] "Generic (PLEG): container finished" podID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerID="f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.747170 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerDied","Data":"f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.749384 4720 generic.go:334] "Generic (PLEG): container 
finished" podID="8da5c3a6-e588-412a-b884-7875fe439e61" containerID="0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.749452 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerDied","Data":"0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.751007 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerID="307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.751062 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerDied","Data":"307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.752384 4720 generic.go:334] "Generic (PLEG): container finished" podID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerID="5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.752431 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerDied","Data":"5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753849 4720 generic.go:334] "Generic (PLEG): container finished" podID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerID="856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerStarted","Data":"e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.160492 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.266752 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"077e6634-d42f-4765-ab65-9e24cf21a047\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.266873 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"077e6634-d42f-4765-ab65-9e24cf21a047\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.268031 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "077e6634-d42f-4765-ab65-9e24cf21a047" (UID: "077e6634-d42f-4765-ab65-9e24cf21a047"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.273199 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq" (OuterVolumeSpecName: "kube-api-access-tvmlq") pod "077e6634-d42f-4765-ab65-9e24cf21a047" (UID: "077e6634-d42f-4765-ab65-9e24cf21a047"). InnerVolumeSpecName "kube-api-access-tvmlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.368363 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.368384 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.766019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerStarted","Data":"e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.766149 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.771704 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.772405 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerDied","Data":"e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.772433 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.801823 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podStartSLOduration=2.801805472 podStartE2EDuration="2.801805472s" podCreationTimestamp="2026-01-21 14:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:42.794631657 +0000 UTC m=+1100.703371589" watchObservedRunningTime="2026-01-21 14:47:42.801805472 +0000 UTC m=+1100.710545404" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.810804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerDied","Data":"658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.811439 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.812554 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerDied","Data":"80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.812576 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.814948 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerDied","Data":"44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.814975 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.816999 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerDied","Data":"a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.817037 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.827894 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" 
event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerDied","Data":"528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.827955 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.922089 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.953643 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.980497 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.989184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.010673 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"82f9f1ca-7fe3-4e17-8393-20364149010d\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043196 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"8da5c3a6-e588-412a-b884-7875fe439e61\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043247 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"82f9f1ca-7fe3-4e17-8393-20364149010d\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043271 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"d3a4204b-d91a-4d30-bea2-c327b452b61a\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"d3a4204b-d91a-4d30-bea2-c327b452b61a\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043349 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"8da5c3a6-e588-412a-b884-7875fe439e61\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " Jan 21 14:47:47 crc 
kubenswrapper[4720]: I0121 14:47:47.043387 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"5ffa29ff-07bd-40cc-9853-a484f79b382f\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043461 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"5ffa29ff-07bd-40cc-9853-a484f79b382f\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043492 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"6545ddce-5b65-4702-9dee-2f2d9644123e\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043532 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"6545ddce-5b65-4702-9dee-2f2d9644123e\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.044992 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3a4204b-d91a-4d30-bea2-c327b452b61a" (UID: "d3a4204b-d91a-4d30-bea2-c327b452b61a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.048437 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8da5c3a6-e588-412a-b884-7875fe439e61" (UID: "8da5c3a6-e588-412a-b884-7875fe439e61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.048869 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ffa29ff-07bd-40cc-9853-a484f79b382f" (UID: "5ffa29ff-07bd-40cc-9853-a484f79b382f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82f9f1ca-7fe3-4e17-8393-20364149010d" (UID: "82f9f1ca-7fe3-4e17-8393-20364149010d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049394 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246" (OuterVolumeSpecName: "kube-api-access-n2246") pod "6545ddce-5b65-4702-9dee-2f2d9644123e" (UID: "6545ddce-5b65-4702-9dee-2f2d9644123e"). InnerVolumeSpecName "kube-api-access-n2246". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049409 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6545ddce-5b65-4702-9dee-2f2d9644123e" (UID: "6545ddce-5b65-4702-9dee-2f2d9644123e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.051477 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt" (OuterVolumeSpecName: "kube-api-access-fqgdt") pod "82f9f1ca-7fe3-4e17-8393-20364149010d" (UID: "82f9f1ca-7fe3-4e17-8393-20364149010d"). InnerVolumeSpecName "kube-api-access-fqgdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.051920 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk" (OuterVolumeSpecName: "kube-api-access-8j6rk") pod "8da5c3a6-e588-412a-b884-7875fe439e61" (UID: "8da5c3a6-e588-412a-b884-7875fe439e61"). InnerVolumeSpecName "kube-api-access-8j6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.052401 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs" (OuterVolumeSpecName: "kube-api-access-2tvcs") pod "d3a4204b-d91a-4d30-bea2-c327b452b61a" (UID: "d3a4204b-d91a-4d30-bea2-c327b452b61a"). InnerVolumeSpecName "kube-api-access-2tvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.052844 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm" (OuterVolumeSpecName: "kube-api-access-7lmjm") pod "5ffa29ff-07bd-40cc-9853-a484f79b382f" (UID: "5ffa29ff-07bd-40cc-9853-a484f79b382f"). InnerVolumeSpecName "kube-api-access-7lmjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144770 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144808 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144844 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144861 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144875 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144886 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144921 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144934 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144944 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144956 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839105 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839127 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839147 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerStarted","Data":"77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c"} Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839154 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839181 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839189 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.953404 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f47pm" podStartSLOduration=2.662609213 podStartE2EDuration="9.953379589s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="2026-01-21 14:47:39.477377456 +0000 UTC m=+1097.386117388" lastFinishedPulling="2026-01-21 14:47:46.768147822 +0000 UTC m=+1104.676887764" observedRunningTime="2026-01-21 14:47:47.862578275 +0000 UTC m=+1105.771318227" watchObservedRunningTime="2026-01-21 14:47:47.953379589 +0000 UTC m=+1105.862119531" Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.546852 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.633128 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.633631 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" containerID="cri-o://2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" gracePeriod=10 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.865257 4720 generic.go:334] "Generic (PLEG): container finished" podID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerID="2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" exitCode=0 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.865455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc"} Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.868323 4720 generic.go:334] "Generic (PLEG): container finished" podID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerID="77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c" exitCode=0 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.868353 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerDied","Data":"77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c"} Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.069945 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227813 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227861 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227902 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227995 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.228026 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.240855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8" (OuterVolumeSpecName: "kube-api-access-w9bw8") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "kube-api-access-w9bw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.284751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config" (OuterVolumeSpecName: "config") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.285050 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.286510 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333403 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333475 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333493 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333542 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.341419 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.435014 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.880681 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"796d2687528fd87d25dd4fc1a5f89808d76b284cc0b9360ef63068e7663548e8"} Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.880738 4720 scope.go:117] "RemoveContainer" containerID="2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.882271 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.934210 4720 scope.go:117] "RemoveContainer" containerID="ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.943858 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.952528 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.238095 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350722 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.356993 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt" (OuterVolumeSpecName: "kube-api-access-fnhtt") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "kube-api-access-fnhtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.378165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.405290 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data" (OuterVolumeSpecName: "config-data") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452892 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452950 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452964 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.689596 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" path="/var/lib/kubelet/pods/c819d03b-78e1-470e-96dc-6144aa8e8f5a/volumes" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891839 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerDied","Data":"dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410"} Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891895 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891972 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.146800 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147348 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147366 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147392 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147399 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147412 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147418 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147427 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147432 4720 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147442 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147447 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147457 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147462 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147471 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="init" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147476 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="init" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147489 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147495 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147507 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147513 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147639 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149736 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149768 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149779 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149792 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149801 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149811 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149833 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.150793 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.166012 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170644 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170766 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170787 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170871 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.218339 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.220698 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229165 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229390 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229616 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229762 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.253268 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272396 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272442 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272478 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272500 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272525 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272545 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272595 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272776 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.273643 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.273832 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.275735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.276268 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.329307 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpjd\" (UniqueName: 
\"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373548 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373632 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373667 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373682 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373721 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.381496 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.384861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.385115 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: 
\"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.390996 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.394028 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.399274 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.466265 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.501380 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.509020 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.516430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.519134 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.555721 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.559482 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.568307 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.569268 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.579012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9pjf9" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.590067 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.637573 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.676951 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.678617 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680695 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680737 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680805 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680824 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680851 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680877 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680910 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc 
kubenswrapper[4720]: I0121 14:47:53.680944 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684321 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r779p" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.692523 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.722174 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.723060 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.726774 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r7487" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.726960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.728247 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.761714 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783562 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783597 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783628 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783671 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783701 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783744 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783759 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783782 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783844 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.789276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: 
\"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.789733 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.790140 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.799281 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.800377 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.809409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.813017 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.813870 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.816182 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.821042 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.841344 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.852019 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.855698 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.904835 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918621 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918726 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918780 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918812 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918865 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918885 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.927688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.929281 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.940250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.954300 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.960536 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.974213 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.976089 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.978044 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.979846 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.980531 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7rq7c" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.010302 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042173 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042268 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042387 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042595 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051806 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051858 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " 
pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051891 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051992 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052065 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052101 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052128 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.055612 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.055854 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.058200 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.058693 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.068913 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.071168 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.100975 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154117 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154199 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154219 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154237 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154256 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154272 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154345 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154385 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.157283 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.157597 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.158667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.158897 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.159161 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: 
\"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.162961 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.163184 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.163597 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.181232 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.187235 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.352864 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.358066 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.362040 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:54 crc kubenswrapper[4720]: W0121 14:47:54.371170 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe269e5_4b5b_4c08_acd0_2a2218d121f9.slice/crio-9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca WatchSource:0}: Error finding container 9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca: Status 404 returned error can't find the container with id 9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.377013 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.620946 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.661287 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:54 crc kubenswrapper[4720]: W0121 14:47:54.684478 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb266c3f_6b50_4953_9f9f_9b41bfc3c4c2.slice/crio-97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846 WatchSource:0}: Error finding container 97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846: Status 404 returned error can't find the container with id 97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846 Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.829328 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.860993 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.001406 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.002286 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerStarted","Data":"1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.003592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerStarted","Data":"2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.005422 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerStarted","Data":"8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009873 4720 generic.go:334] "Generic (PLEG): container finished" podID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerID="8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae" exitCode=0 Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009900 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerDied","Data":"8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009915 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerStarted","Data":"9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.053218 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 
14:47:55.071675 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.188923 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.488111 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.677959 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688617 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688684 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688807 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688836 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688866 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.703111 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd" (OuterVolumeSpecName: "kube-api-access-5lpjd") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "kube-api-access-5lpjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.733916 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.740463 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.754774 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.783340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config" (OuterVolumeSpecName: "config") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792339 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792461 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792534 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792720 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792797 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018575 4720 generic.go:334] "Generic (PLEG): container finished" podID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868" exitCode=0 Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018641 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018700 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" 
event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerStarted","Data":"f174f6fdc28ef67dfde4e4bef9624d4a07461dcbbd47420bdd6d732525e7403e"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.022451 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerStarted","Data":"bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerDied","Data":"9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071444 4720 scope.go:117] "RemoveContainer" containerID="8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071601 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.105512 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sbng7" podStartSLOduration=3.105495672 podStartE2EDuration="3.105495672s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:56.096027767 +0000 UTC m=+1114.004767699" watchObservedRunningTime="2026-01-21 14:47:56.105495672 +0000 UTC m=+1114.014235604" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.119488 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerStarted","Data":"acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.128683 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerStarted","Data":"4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.130587 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerStarted","Data":"45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.149788 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fhvrr" podStartSLOduration=3.149770554 podStartE2EDuration="3.149770554s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:56.145382131 +0000 UTC m=+1114.054122053" watchObservedRunningTime="2026-01-21 14:47:56.149770554 +0000 UTC m=+1114.058510486" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.206883 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.231784 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:56 crc 
kubenswrapper[4720]: I0121 14:47:56.706345 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" path="/var/lib/kubelet/pods/cfe269e5-4b5b-4c08-acd0-2a2218d121f9/volumes" Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.141066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerStarted","Data":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"} Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.142369 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.163589 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" podStartSLOduration=4.163116935 podStartE2EDuration="4.163116935s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:57.161703958 +0000 UTC m=+1115.070443890" watchObservedRunningTime="2026-01-21 14:47:57.163116935 +0000 UTC m=+1115.071856857" Jan 21 14:48:01 crc kubenswrapper[4720]: I0121 14:48:01.203346 4720 generic.go:334] "Generic (PLEG): container finished" podID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerID="bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f" exitCode=0 Jan 21 14:48:01 crc kubenswrapper[4720]: I0121 14:48:01.203380 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerDied","Data":"bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f"} Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.360463 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.416893 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.417168 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" containerID="cri-o://e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666" gracePeriod=10 Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.238103 4720 generic.go:334] "Generic (PLEG): container finished" podID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerID="e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666" exitCode=0 Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.238145 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666"} Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.544854 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.629024 4720 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7"
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656540 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656621 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656691 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656721 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656745 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656809 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") "
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.701009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.701027 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.703091 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8" (OuterVolumeSpecName: "kube-api-access-sbvd8") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "kube-api-access-sbvd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.700737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts" (OuterVolumeSpecName: "scripts") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.716011 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data" (OuterVolumeSpecName: "config-data") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.728886 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758275 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758313 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758327 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758341 4720 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758352 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758363 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.265747 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerDied","Data":"8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126"}
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.266063 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.265886 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.743980 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sbng7"]
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.750903 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sbng7"]
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805354 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lsn2k"]
Jan 21 14:48:07 crc kubenswrapper[4720]: E0121 14:48:07.805905 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805918 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init"
Jan 21 14:48:07 crc kubenswrapper[4720]: E0121 14:48:07.805936 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805943 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806138 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806159 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806715 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809301 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.810322 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.810613 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.815004 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"]
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988116 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988198 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988270 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988342 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988371 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988397 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090577 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090643 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090730 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090776 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.096305 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.097087 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.099991 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.104824 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.107899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.114057 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.133917 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.692677 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" path="/var/lib/kubelet/pods/2db18bae-9cc2-4e10-b04c-edd0bb539b79/volumes"
Jan 21 14:48:10 crc kubenswrapper[4720]: I0121 14:48:10.545304 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.541633 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.542165 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqjpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wtr5d_openstack(2eaf7930-34cf-4396-9b94-c09d3a5da09a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.544227 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wtr5d" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a"
Jan 21 14:48:15 crc kubenswrapper[4720]: I0121 14:48:15.545680 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Jan 21 14:48:15 crc kubenswrapper[4720]: I0121 14:48:15.545789 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"
Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.385287 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wtr5d" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a"
Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.766732 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.767174 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85gwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vz5k2_openstack(d468a637-b18d-47fd-9b04-910dba72a955): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.768492 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vz5k2" podUID="d468a637-b18d-47fd-9b04-910dba72a955"
Jan 21 14:48:16 crc kubenswrapper[4720]: I0121 14:48:16.929684 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.059637 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") "
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.059974 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") "
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060024 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") "
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060075 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") "
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060094 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") "
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.071010 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l" (OuterVolumeSpecName: "kube-api-access-6v27l") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "kube-api-access-6v27l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.105445 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.107148 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.111022 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config" (OuterVolumeSpecName: "config") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.117034 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161733 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161769 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161781 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161792 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161806 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.175006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"]
Jan 21 14:48:17 crc kubenswrapper[4720]: W0121 14:48:17.185751 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03e400cd_53d2_4738_96f0_75829e339879.slice/crio-914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8 WatchSource:0}: Error finding container 914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8: Status 404 returned error can't find the container with id 914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.391795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586"}
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.391827 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.392139 4720 scope.go:117] "RemoveContainer" containerID="e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666"
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.398043 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"}
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.402030 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerStarted","Data":"4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91"}
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.402062 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerStarted","Data":"914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8"}
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.406257 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerStarted","Data":"e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130"}
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.416047 4720 scope.go:117] "RemoveContainer" containerID="856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87"
Jan 21 14:48:17 crc kubenswrapper[4720]: E0121 14:48:17.416431 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vz5k2" podUID="d468a637-b18d-47fd-9b04-910dba72a955"
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.435765 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lsn2k" podStartSLOduration=10.435737429 podStartE2EDuration="10.435737429s" podCreationTimestamp="2026-01-21 14:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:17.428220815 +0000 UTC m=+1135.336960787" watchObservedRunningTime="2026-01-21 14:48:17.435737429 +0000 UTC m=+1135.344477391"
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.477574 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"]
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.487251 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"]
Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.493022 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fh44q" podStartSLOduration=2.956255761 podStartE2EDuration="24.493003148s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:55.188357806 +0000 UTC m=+1113.097097728" lastFinishedPulling="2026-01-21 14:48:16.725105183 +0000 UTC m=+1134.633845115" observedRunningTime="2026-01-21 14:48:17.485113494 +0000 UTC m=+1135.393853446" watchObservedRunningTime="2026-01-21 14:48:17.493003148 +0000 UTC m=+1135.401743080"
Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.418413 4720 generic.go:334] "Generic (PLEG): container finished" podID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerID="4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07" exitCode=0
Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.419700 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerDied","Data":"4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07"}
Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.690197 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" path="/var/lib/kubelet/pods/e7c8b46b-d758-4538-a345-21ccc71aabe4/volumes"
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.427965 4720 generic.go:334] "Generic (PLEG): container finished" podID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerID="e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130" exitCode=0
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.428324 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerDied","Data":"e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130"}
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.438884 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"}
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.760795 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fhvrr"
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.845922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") "
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.845994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") "
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.846082 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") "
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.851975 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq" (OuterVolumeSpecName: "kube-api-access-fb2pq") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "kube-api-access-fb2pq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.875837 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config" (OuterVolumeSpecName: "config") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.884017 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947197 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947236 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947247 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.450867 4720 generic.go:334] "Generic (PLEG): container finished" podID="03e400cd-53d2-4738-96f0-75829e339879" containerID="4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91" exitCode=0
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.450961 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerDied","Data":"4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91"}
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457819 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerDied","Data":"1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450"}
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457858 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457981 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fhvrr"
Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.625626 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a6c6de6_8f88_4c87_bd8e_46579996948e.slice\": RecentStats: unable to find data in memory cache]"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649058 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"]
Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649356 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649369 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync"
Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649377 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649385 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns"
Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649394 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="init"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649400 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="init"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649544 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649554 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.650279 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.674549 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"]
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777901 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777996 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.778117 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.880540 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881249 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881457 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881683 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.883590 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884194 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884838 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.918769 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"]
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.920061 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.931667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.944347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.944553 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.945021 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.945125 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r779p"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.970638 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"]
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987071 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987132 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987176 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987262 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.006835 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089687 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089758 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.098159 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.103030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.103707 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.133514 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.133527 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.238202 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fh44q"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.323023 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") "
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394734 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") "
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394800 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") "
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394840 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") "
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394857 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") "
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.395441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs" (OuterVolumeSpecName: "logs") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.402935 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts" (OuterVolumeSpecName: "scripts") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.413799 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g" (OuterVolumeSpecName: "kube-api-access-xpq8g") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "kube-api-access-xpq8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.431381 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.449263 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data" (OuterVolumeSpecName: "config-data") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498459 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498498 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498510 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498520 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498531 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.517514 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"]
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.521819 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fh44q"
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.522071 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerDied","Data":"45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699"}
Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.522124 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.414371 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8648996d7d-4f2q4"]
Jan 21 14:48:22 crc kubenswrapper[4720]: E0121 14:48:22.415771 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.415794 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.415940 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.416957 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.424543 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.424936 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425107 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425116 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425258 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7rq7c"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.430747 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8648996d7d-4f2q4"]
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521078 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521161 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521176 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwbw\" (UniqueName: \"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521209 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521231 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521254 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622132 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622207 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622258 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622278 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwbw\" (UniqueName: \"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622327 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622352 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.625911 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.629922 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.630691 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.637020 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.641705 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.649851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwbw\" (UniqueName: \"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.653104 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: 
\"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.760764 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.212181 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"] Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.213754 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.216977 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.217329 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.257347 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"] Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.354849 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.354970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356020 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356482 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356513 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 
14:48:24.356563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.457996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458124 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458143 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458203 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.463951 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.465626 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.468223 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.472937 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.475301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.475581 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.477546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.535372 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:30 crc kubenswrapper[4720]: I0121 14:48:30.297621 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372440 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372478 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372520 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372600 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372692 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s" (OuterVolumeSpecName: "kube-api-access-mnr6s") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "kube-api-access-mnr6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389220 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts" (OuterVolumeSpecName: "scripts") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389286 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389278 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.438231 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data" (OuterVolumeSpecName: "config-data") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.446786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474412 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474611 4720 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474621 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474630 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474639 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474646 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.607535 4720 generic.go:334] "Generic (PLEG): container finished" podID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerID="6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208" exitCode=0 Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.607586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208"} Jan 21 14:48:31 crc 
kubenswrapper[4720]: I0121 14:48:30.607609 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerStarted","Data":"63bb1ec50feeba6e58d30594b8e6156c8d4ea90e8cd1ba58353f3003db3dc734"} Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.612357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"} Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614766 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerDied","Data":"914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8"} Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614793 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614795 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.580565 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69cc8766db-gdch7"] Jan 21 14:48:31 crc kubenswrapper[4720]: E0121 14:48:31.581143 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581155 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581286 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581789 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.591366 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.591535 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592113 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592210 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.594069 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.634533 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635523 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635554 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635581 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635637 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635678 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635696 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.641704 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69cc8766db-gdch7"] Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.652431 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.676637 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerStarted","Data":"7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410"} Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.677744 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.680148 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerStarted","Data":"da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38"} Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.704546 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8648996d7d-4f2q4"] Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.729521 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podStartSLOduration=11.729507035 podStartE2EDuration="11.729507035s" podCreationTimestamp="2026-01-21 14:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:31.72928086 +0000 UTC m=+1149.638020802" watchObservedRunningTime="2026-01-21 14:48:31.729507035 +0000 UTC m=+1149.638246967" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737893 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737944 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737993 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738158 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738249 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776416 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " 
pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784533 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784876 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.785482 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.800301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.812191 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vz5k2" podStartSLOduration=3.603836878 podStartE2EDuration="38.812175719s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:55.096847243 +0000 UTC m=+1113.005587175" lastFinishedPulling="2026-01-21 14:48:30.305186084 +0000 UTC m=+1148.213926016" observedRunningTime="2026-01-21 14:48:31.785161092 +0000 UTC m=+1149.693901024" watchObservedRunningTime="2026-01-21 14:48:31.812175719 +0000 UTC m=+1149.720915651" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.927648 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"] Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.055469 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.639257 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69cc8766db-gdch7"] Jan 21 14:48:32 crc kubenswrapper[4720]: W0121 14:48:32.648043 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edd5078_75bc_4823_b52f_ad5effeace06.slice/crio-cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b WatchSource:0}: Error finding container cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b: Status 404 returned error can't find the container with id cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698533 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"8237450b616f813ffba74d9888f3ca2e07afe776f646df699e7891cc5569f709"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"0d503bdd2ab088d4b834ba24a297b1d2f6ddf36e67e8327ac4567dab605f8dc6"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698814 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"25b1b940e7a3af70b2d71e8f9b26e1a866ed65c52a96efa0a6d7b8493de2ed4b"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698845 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698864 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.700627 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerStarted","Data":"aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703359 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"25b8ba9f43d4e200b4217f3ee0a98bd5fc4bebff3320e0c0fe7a54f03c943bc4"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703396 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"5e5dd967c7901c33d6bc8eceb49611162cac7ed45b2d8530b7a6e9383438987d"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703407 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"232752c2f62e2e7c660628d602ace55a54c7920d10aa16bb08ffd899712eedf3"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.705452 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710798 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710836 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"68237d054d159a4765c532ce618148ffe9f3f4fde0b73c4c9dd1d07a506b6603"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.711844 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.713041 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69cc8766db-gdch7" event={"ID":"0edd5078-75bc-4823-b52f-ad5effeace06","Type":"ContainerStarted","Data":"cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b"} Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.816627 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c8b4f85f7-4kz9x" podStartSLOduration=8.816611839 podStartE2EDuration="8.816611839s" podCreationTimestamp="2026-01-21 14:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:32.812119293 +0000 UTC m=+1150.720859235" watchObservedRunningTime="2026-01-21 14:48:32.816611839 +0000 UTC m=+1150.725351771" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.840354 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f78c5dfcb-hsblf" podStartSLOduration=12.840337542 podStartE2EDuration="12.840337542s" podCreationTimestamp="2026-01-21 14:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:32.838346621 +0000 UTC m=+1150.747086553" watchObservedRunningTime="2026-01-21 14:48:32.840337542 +0000 UTC m=+1150.749077474" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.869056 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wtr5d" podStartSLOduration=3.315160454 podStartE2EDuration="39.869040203s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:54.878501136 +0000 UTC m=+1112.787241068" lastFinishedPulling="2026-01-21 14:48:31.432380885 +0000 UTC m=+1149.341120817" observedRunningTime="2026-01-21 14:48:32.864065864 +0000 UTC m=+1150.772805796" watchObservedRunningTime="2026-01-21 14:48:32.869040203 +0000 UTC m=+1150.777780135" Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.894241 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8648996d7d-4f2q4" podStartSLOduration=10.894224433 podStartE2EDuration="10.894224433s" podCreationTimestamp="2026-01-21 14:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
14:48:32.889980684 +0000 UTC m=+1150.798720626" watchObservedRunningTime="2026-01-21 14:48:32.894224433 +0000 UTC m=+1150.802964365" Jan 21 14:48:33 crc kubenswrapper[4720]: I0121 14:48:33.721246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69cc8766db-gdch7" event={"ID":"0edd5078-75bc-4823-b52f-ad5effeace06","Type":"ContainerStarted","Data":"4618462f6d880a57ca93707068b944d2fd4d41e47ea3695e54afec3affa91c60"} Jan 21 14:48:33 crc kubenswrapper[4720]: I0121 14:48:33.741274 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69cc8766db-gdch7" podStartSLOduration=2.74125857 podStartE2EDuration="2.74125857s" podCreationTimestamp="2026-01-21 14:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:33.739026432 +0000 UTC m=+1151.647766364" watchObservedRunningTime="2026-01-21 14:48:33.74125857 +0000 UTC m=+1151.649998502" Jan 21 14:48:34 crc kubenswrapper[4720]: I0121 14:48:34.731780 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.009438 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.103679 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.104142 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns" containerID="cri-o://807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" gracePeriod=10 Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.556467 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642315 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642413 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642458 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642568 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642587 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.659980 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f" (OuterVolumeSpecName: "kube-api-access-2bv6f") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "kube-api-access-2bv6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.690192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.690894 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.700389 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.710785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config" (OuterVolumeSpecName: "config") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.744331 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745444 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745540 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745602 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745674 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745619 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerDied","Data":"aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234"} Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745598 4720 generic.go:334] "Generic (PLEG): container finished" podID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerID="aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234" exitCode=0 Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749236 4720 generic.go:334] "Generic (PLEG): container finished" podID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" exitCode=0 Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749274 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"} Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749299 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"f174f6fdc28ef67dfde4e4bef9624d4a07461dcbbd47420bdd6d732525e7403e"} Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749316 4720 scope.go:117] "RemoveContainer" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749434 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.792149 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.799636 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.800454 4720 scope.go:117] "RemoveContainer" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830260 4720 scope.go:117] "RemoveContainer" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" Jan 21 14:48:36 crc kubenswrapper[4720]: E0121 14:48:36.830861 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": container with ID starting with 807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17 not found: ID does not exist" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830908 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"} err="failed to get container status \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": rpc error: code = NotFound desc = could not find container \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": container with ID starting with 807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17 not found: ID does not exist" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830934 4720 scope.go:117] "RemoveContainer" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868" Jan 21 14:48:36 crc kubenswrapper[4720]: E0121 14:48:36.831613 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": container with ID starting with e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868 not found: ID does not exist" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868" Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.831681 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"} err="failed to get container status \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": rpc error: code = NotFound desc = could not find container \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": container with ID starting with e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868 not found: ID does not exist" Jan 21 14:48:37 crc kubenswrapper[4720]: I0121 14:48:37.758910 4720 generic.go:334] "Generic (PLEG): container finished" podID="d468a637-b18d-47fd-9b04-910dba72a955" containerID="da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38" exitCode=0 Jan 21 14:48:37 crc kubenswrapper[4720]: I0121 14:48:37.758981 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" 
event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerDied","Data":"da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38"} Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.099999 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201449 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201754 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201775 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.209882 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.210856 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm" (OuterVolumeSpecName: "kube-api-access-sqjpm") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "kube-api-access-sqjpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.234917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303384 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303648 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303679 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.694024 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" path="/var/lib/kubelet/pods/6a60b31b-eca6-4e2d-8dcd-0097033a8a35/volumes" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770849 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770846 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerDied","Data":"2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82"} Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770889 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062506 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"] Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062775 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062787 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062805 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="init" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062811 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="init" Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062829 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062835 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.063665 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.063685 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" 
containerName="dnsmasq-dns" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.064438 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067159 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067531 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9pjf9" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.131367 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.158432 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"] Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.179783 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.179818 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.180106 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.180982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.197298 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.197537 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.209056 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221708 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221784 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.245052 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.270236 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.270425 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323720 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323923 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323952 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324017 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324047 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327302 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327321 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc 
kubenswrapper[4720]: I0121 14:48:39.327358 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327382 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327509 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.328197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.332834 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.338825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts" (OuterVolumeSpecName: "scripts") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.339900 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.343826 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv" (OuterVolumeSpecName: "kube-api-access-85gwv") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "kube-api-access-85gwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.348998 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.349035 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.362382 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.387480 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.388646 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.396076 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.408171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.417501 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429142 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429217 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429234 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429274 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " 
pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429288 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429338 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429363 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429405 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429450 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429460 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429469 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429478 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429486 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.433135 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.433404 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.443575 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8f88c9d47-m5rzn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.448315 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.460383 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data" (OuterVolumeSpecName: "config-data") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.475018 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.480722 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.514457 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.524056 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.530921 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.535805 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.535982 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541110 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541343 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541447 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541637 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541762 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") 
pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.542128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.542333 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.543358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.544199 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.544271 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.546013 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.557455 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.594981 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.643584 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644031 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644187 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644825 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.648185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.651326 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.661234 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc 
kubenswrapper[4720]: I0121 14:48:39.773000 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.845113 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.852923 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerDied","Data":"acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705"} Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.852961 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.853061 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.989934 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.000446 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r7487" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013948 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.014068 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.041679 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.110789 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166069 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166414 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166500 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166528 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.233788 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.256757 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268548 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268674 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268720 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268774 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc 
kubenswrapper[4720]: I0121 14:48:40.268830 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.269995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.272813 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.274416 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.285802 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.286323 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.309398 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.379136 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492094 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492361 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492483 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492523 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492562 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.560037 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.572410 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.572465 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.578953 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593736 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593827 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.594796 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595701 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595843 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: 
\"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.657303 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729824 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729876 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729900 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730048 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730066 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730091 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.806201 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.823438 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.833070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834291 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834360 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834451 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834517 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.836673 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.842103 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.852293 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.853012 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.860827 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.877473 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.903916 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"dc9243e8f48bab6e285da615fb644a56108b7aafede7cddffde36655963464f8"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.909570 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"000a5953f4f4acbb293f3fd20f9c4128f67fc3aedf2d6b521a90bf43134a33df"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.916398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerStarted","Data":"c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.936383 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.153429 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.204889 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.648782 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:41 crc kubenswrapper[4720]: W0121 14:48:41.686785 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c32ba08_0c9c_4f0a_b9f4_f56e91ee566e.slice/crio-90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49 WatchSource:0}: Error finding container 90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49: Status 404 returned error can't find the container with id 90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49 Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.844578 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.863113 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.961592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.975551 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"88ba151a7faa9c14ce1ffb6ea4af84a9781883a8fd8016a1bc953df79c7742c2"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.977019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerStarted","Data":"c28c627632464fe77ab3019eb5addeb203e6d652bffef45ba63964d4aabbdd0c"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.979456 4720 generic.go:334] "Generic (PLEG): container finished" podID="4775aea1-f465-4995-a37a-1285ed8229dd" containerID="59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1" exitCode=0 Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.979501 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerDied","Data":"59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1"} Jan 21 14:48:42 crc kubenswrapper[4720]: I0121 14:48:42.008301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} Jan 21 14:48:42 crc kubenswrapper[4720]: I0121 14:48:42.008365 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"3d7a3ecb27979a3b44e08cb4019116ada1170edb766f69f4f67fae7e26496dfb"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.011476 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.018552 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" 
event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerDied","Data":"c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.018591 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020429 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020575 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020603 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.021573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.022626 4720 generic.go:334] "Generic (PLEG): container finished" podID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerID="4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506" exitCode=0 Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.022669 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.030498 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.046539 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c6979b468-whx5j" podStartSLOduration=4.046522812 podStartE2EDuration="4.046522812s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:43.044287744 +0000 UTC m=+1160.953027676" watchObservedRunningTime="2026-01-21 14:48:43.046522812 +0000 UTC m=+1160.955262744" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087631 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087758 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087800 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087838 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.116271 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq" (OuterVolumeSpecName: "kube-api-access-gtkhq") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "kube-api-access-gtkhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.124096 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.143848 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.145232 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.161184 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config" (OuterVolumeSpecName: "config") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.189994 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190030 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190040 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190050 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190060 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.029834 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.082549 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.104776 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.690542 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" path="/var/lib/kubelet/pods/4775aea1-f465-4995-a37a-1285ed8229dd/volumes" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.936049 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:45 crc kubenswrapper[4720]: E0121 14:48:45.937097 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.937179 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.937436 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.938368 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.941729 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.942995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.985285 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038247 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038306 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038454 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038476 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139949 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140035 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140129 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.144882 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.146245 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.147258 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.147772 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.148222 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.149191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.158327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.294477 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.091939 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerStarted","Data":"332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95"} Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.092888 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.098135 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"17d4eadec8ff63e347f4a38724652faff4a563967bcc0c5d98afe8db8947c21c"} Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.114568 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" podStartSLOduration=11.114554914 podStartE2EDuration="11.114554914s" podCreationTimestamp="2026-01-21 14:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:51.111829704 +0000 UTC m=+1169.020569626" watchObservedRunningTime="2026-01-21 14:48:51.114554914 +0000 UTC m=+1169.023294846" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.175616 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:51 crc kubenswrapper[4720]: W0121 14:48:51.191535 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b177763_3020_4854_b45a_43d99221c670.slice/crio-dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8 WatchSource:0}: Error finding container dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8: Status 404 returned error can't find the container with id dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8 Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.375309 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.880689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.133695 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"a072d993c4c9b086f9b3fd34af075479c303033d57f5e867fb77f4b5a3a1001f"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.133990 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"6a46c297738169de3ef3d1ba7f16f1ce96061244f5167c53d32ec12f7a944075"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134004 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134174 4720 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134203 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.155246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"bb91bf9c5a54216cd959a646667850e4ce3f0d88eaac0af56180fdf9f8f472a6"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.171383 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f448c69d6-sjp2r" podStartSLOduration=7.171355237 podStartE2EDuration="7.171355237s" podCreationTimestamp="2026-01-21 14:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:52.157294724 +0000 UTC m=+1170.066034676" watchObservedRunningTime="2026-01-21 14:48:52.171355237 +0000 UTC m=+1170.080095169" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.179748 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"2e37e880e1ded60bfde91a112ff6c6e3bdb214678f5b94a391c433bade1ec4b8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.179794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"b481f179dc89b65a9e2d445c9f4cf87d2a628d4bc58ff169e717daab38105392"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.194634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.204108 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8f88c9d47-m5rzn" podStartSLOduration=2.781047323 podStartE2EDuration="13.204091762s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:40.177979928 +0000 UTC m=+1158.086719860" lastFinishedPulling="2026-01-21 14:48:50.601024367 +0000 UTC m=+1168.509764299" observedRunningTime="2026-01-21 14:48:52.202500491 +0000 UTC m=+1170.111240453" watchObservedRunningTime="2026-01-21 14:48:52.204091762 +0000 UTC m=+1170.112831694" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218159 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218325 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" containerID="cri-o://5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218419 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218450 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" containerID="cri-o://222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218499 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" containerID="cri-o://5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218543 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" containerID="cri-o://ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.233790 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234005 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234031 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" containerID="cri-o://e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234142 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" containerID="cri-o://047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.240370 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" podStartSLOduration=2.9121960380000003 podStartE2EDuration="13.240353348s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:40.272802655 +0000 UTC m=+1158.181542587" lastFinishedPulling="2026-01-21 14:48:50.600959965 +0000 UTC m=+1168.509699897" observedRunningTime="2026-01-21 14:48:52.231196501 +0000 UTC m=+1170.139936433" watchObservedRunningTime="2026-01-21 14:48:52.240353348 +0000 UTC m=+1170.149093280" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.264417 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.244804978 podStartE2EDuration="59.264400488s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:54.688124521 +0000 UTC m=+1112.596864453" lastFinishedPulling="2026-01-21 14:48:50.707720031 +0000 UTC m=+1168.616459963" observedRunningTime="2026-01-21 14:48:52.256080114 +0000 UTC m=+1170.164820056" watchObservedRunningTime="2026-01-21 14:48:52.264400488 +0000 UTC m=+1170.173140420" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.284860 4720 
Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.911023 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247379 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" exitCode=0
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247411 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" exitCode=2
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247448 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"}
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247489 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"}
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.249396 4720 generic.go:334] "Generic (PLEG): container finished" podID="9647cb32-4c23-445c-a66b-c71439bf617d" containerID="047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" exitCode=143
Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.250149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8"}
Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.258564 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"}
Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.261744 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" exitCode=0
Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.261816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"}
Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.278753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.399146176 podStartE2EDuration="15.27873446s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:41.708396077 +0000 UTC m=+1159.617136009" lastFinishedPulling="2026-01-21 14:48:50.587984361 +0000 UTC m=+1168.496724293" observedRunningTime="2026-01-21 14:48:54.275993519 +0000 UTC m=+1172.184733471" watchObservedRunningTime="2026-01-21 14:48:54.27873446 +0000 UTC m=+1172.187474402"
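[editor's note] The pod_startup_latency_tracker records above are internally consistent: podStartSLOduration equals podStartE2EDuration minus the time spent pulling images (lastFinishedPulling - firstStartedPulling), and when both pull stamps are the zero time 0001-01-01 no pull was observed, which is why SLO and E2E durations match for dnsmasq-dns, barbican-api, and cinder-api. Checking the cinder-scheduler-0 figures in Python (the nanosecond fraction is parsed by hand because strptime's %f stops at microseconds):

from datetime import datetime, timezone

def epoch_ns(ts: str) -> int:
    # "2026-01-21 14:48:41.708396077 +0000" -> nanoseconds since the epoch
    whole, frac = ts.split(".")
    digits = frac.split()[0]
    dt = datetime.strptime(whole, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp()) * 10**9 + int(digits.ljust(9, "0"))

pull_ns = epoch_ns("2026-01-21 14:48:50.587984361 +0000") - epoch_ns("2026-01-21 14:48:41.708396077 +0000")
slo = 15.27873446 - pull_ns / 1e9
print(f"{slo:.9f}")  # 6.399146176, matching podStartSLOduration in the record above

The same arithmetic reproduces barbican-worker (13.204091762 - 10.423044439 = 2.781047323) and barbican-keystone-listener (13.240353348 - 10.328157310 = 2.912196038).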
m=+1159.617136009" lastFinishedPulling="2026-01-21 14:48:50.587984361 +0000 UTC m=+1168.496724293" observedRunningTime="2026-01-21 14:48:54.275993519 +0000 UTC m=+1172.184733471" watchObservedRunningTime="2026-01-21 14:48:54.27873446 +0000 UTC m=+1172.187474402" Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.565405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.639811 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.640010 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f78c5dfcb-hsblf" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" containerID="cri-o://e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" gracePeriod=30 Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.640203 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f78c5dfcb-hsblf" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" containerID="cri-o://3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" gracePeriod=30 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.230305 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.275833 4720 generic.go:334] "Generic (PLEG): container finished" podID="eef8d65a-fa41-4368-8368-4b50935db576" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" exitCode=0 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.276149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285275 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" exitCode=0 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285444 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285590 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285636 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285677 4720 scope.go:117] "RemoveContainer" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289204 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289245 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289275 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289310 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289362 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289418 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289448 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289704 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289844 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.290361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.308830 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts" (OuterVolumeSpecName: "scripts") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.342858 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j" (OuterVolumeSpecName: "kube-api-access-b6x4j") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "kube-api-access-b6x4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.379266 4720 scope.go:117] "RemoveContainer" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.379482 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400632 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400680 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400691 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.411714 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.439624 4720 scope.go:117] "RemoveContainer" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.478190 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.482115 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.485740 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.502308 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.502336 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.536950 4720 scope.go:117] "RemoveContainer" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.546905 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data" (OuterVolumeSpecName: "config-data") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.580279 4720 scope.go:117] "RemoveContainer" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.584963 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": container with ID starting with 5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb not found: ID does not exist" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585017 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"} err="failed to get container status \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": rpc error: code = NotFound desc = could not find container \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": container with ID starting with 5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585060 4720 scope.go:117] "RemoveContainer" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.585410 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": container with ID starting with 222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13 not found: ID does not exist" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585477 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"} err="failed to get container status \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": rpc error: code = NotFound desc = could not find container \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": container with ID starting with 222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13 not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585495 4720 scope.go:117] "RemoveContainer" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.585761 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": container with ID starting with ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23 not found: ID does not exist" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585825 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"} err="failed to get container status \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": rpc error: code = NotFound desc = could not 
find container \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": container with ID starting with ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23 not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585848 4720 scope.go:117] "RemoveContainer" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.586117 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": container with ID starting with 5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c not found: ID does not exist" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.586168 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"} err="failed to get container status \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": rpc error: code = NotFound desc = could not find container \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": container with ID starting with 5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.604083 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.640867 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.646170 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.732248 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.732972 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.732991 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733019 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733039 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733046 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733084 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" 
containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733356 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733377 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733401 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733421 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.736022 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.753596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.753820 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.757248 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.833726 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.833931 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834014 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834094 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834190 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" 
Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834273 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834367 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935845 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935967 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935984 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936049 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.937276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" 
Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.937459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.942585 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.942598 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.944016 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.954438 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.967376 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.089782 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.642589 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.692212 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" path="/var/lib/kubelet/pods/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2/volumes" Jan 21 14:48:57 crc kubenswrapper[4720]: I0121 14:48:57.311605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"3174443988b438c1b000ca3e584da6ddab213fd781467fefd432c56f8e7d99aa"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.336986 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.337515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.430420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.154176 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220223 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220436 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" containerID="cri-o://0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" gracePeriod=30 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220830 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" containerID="cri-o://4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" gracePeriod=30 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.354831 4720 generic.go:334] "Generic (PLEG): container finished" podID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" exitCode=143 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.354892 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.361538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae"} Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.836925 4720 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.808896 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.890799 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.891350 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" containerID="cri-o://7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" gracePeriod=10 Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.992244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.008450 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.055806 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.384701 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.385060 4720 generic.go:334] "Generic (PLEG): container finished" podID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerID="7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" exitCode=0 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.385109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.395578 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.396514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419685 4720 generic.go:334] "Generic (PLEG): container finished" podID="eef8d65a-fa41-4368-8368-4b50935db576" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" exitCode=0 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419792 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419850 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"68237d054d159a4765c532ce618148ffe9f3f4fde0b73c4c9dd1d07a506b6603"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419920 4720 scope.go:117] "RemoveContainer" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.420308 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" containerID="cri-o://39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" gracePeriod=30 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.420456 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" containerID="cri-o://89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" gracePeriod=30 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452226 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452529 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452672 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452706 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.468174 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.009648277 podStartE2EDuration="6.46814408s" 
podCreationTimestamp="2026-01-21 14:48:55 +0000 UTC" firstStartedPulling="2026-01-21 14:48:56.654331218 +0000 UTC m=+1174.563071150" lastFinishedPulling="2026-01-21 14:49:00.112827021 +0000 UTC m=+1178.021566953" observedRunningTime="2026-01-21 14:49:01.45223347 +0000 UTC m=+1179.360973412" watchObservedRunningTime="2026-01-21 14:49:01.46814408 +0000 UTC m=+1179.376884012" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.481187 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh" (OuterVolumeSpecName: "kube-api-access-kjrlh") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "kube-api-access-kjrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.491029 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.556983 4720 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.557011 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.584200 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.608404 4720 scope.go:117] "RemoveContainer" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.611815 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.615732 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config" (OuterVolumeSpecName: "config") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.648826 4720 scope.go:117] "RemoveContainer" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: E0121 14:49:01.656432 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": container with ID starting with 3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e not found: ID does not exist" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.656502 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"} err="failed to get container status \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": rpc error: code = NotFound desc = could not find container \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": container with ID starting with 3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e not found: ID does not exist" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.656536 4720 scope.go:117] "RemoveContainer" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660013 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660086 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660119 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660168 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: E0121 14:49:01.660183 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": container with ID starting with e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3 not found: ID does not exist" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660234 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660598 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660616 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660225 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"} err="failed to get container status \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": rpc error: code = NotFound desc = could not find container \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": container with ID starting with e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3 not found: ID does not exist" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.676009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.688899 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz" (OuterVolumeSpecName: "kube-api-access-lp7tz") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "kube-api-access-lp7tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.728199 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.728237 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.739322 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config" (OuterVolumeSpecName: "config") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.759353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762344 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762373 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762384 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762392 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762402 4720 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762409 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762802 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.768638 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428016 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"63bb1ec50feeba6e58d30594b8e6156c8d4ea90e8cd1ba58353f3003db3dc734"} Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428202 4720 scope.go:117] "RemoveContainer" containerID="7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428074 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.432962 4720 generic.go:334] "Generic (PLEG): container finished" podID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" exitCode=0 Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.433059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"} Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.447147 4720 scope.go:117] "RemoveContainer" containerID="6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.492872 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.497469 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.695355 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" path="/var/lib/kubelet/pods/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01/volumes" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.696041 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef8d65a-fa41-4368-8368-4b50935db576" path="/var/lib/kubelet/pods/eef8d65a-fa41-4368-8368-4b50935db576/volumes" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.771728 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:51548->10.217.0.148:9311: read: connection reset by peer" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.771958 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:51558->10.217.0.148:9311: read: connection reset by peer" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.044140 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096645 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096695 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096875 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096897 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.099769 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107853 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct" (OuterVolumeSpecName: "kube-api-access-bn2ct") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "kube-api-access-bn2ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107838 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts" (OuterVolumeSpecName: "scripts") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199434 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199496 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199512 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199547 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.220785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.285042 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data" (OuterVolumeSpecName: "config-data") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.300732 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.300755 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.355472 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401803 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401845 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401869 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401938 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.402006 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.402678 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs" (OuterVolumeSpecName: "logs") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.424532 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n" (OuterVolumeSpecName: "kube-api-access-hzc7n") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "kube-api-access-hzc7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.424885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.445059 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448838 4720 generic.go:334] "Generic (PLEG): container finished" podID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" exitCode=0 Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448898 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448923 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"3d7a3ecb27979a3b44e08cb4019116ada1170edb766f69f4f67fae7e26496dfb"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448939 4720 scope.go:117] "RemoveContainer" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.449024 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.464965 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data" (OuterVolumeSpecName: "config-data") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466201 4720 generic.go:334] "Generic (PLEG): container finished" podID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" exitCode=0 Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466263 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466273 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466291 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511120 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511159 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511171 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511181 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511192 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.537189 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.537338 4720 scope.go:117] "RemoveContainer" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554415 4720 scope.go:117] "RemoveContainer" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.554848 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": container with ID starting with 4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7 not found: ID does not exist" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554891 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} err="failed to get container status \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": rpc error: code = NotFound desc = could not find container \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": container with ID starting with 4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554917 4720 scope.go:117] "RemoveContainer" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.555164 4720 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": container with ID starting with 0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be not found: ID does not exist" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555193 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} err="failed to get container status \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": rpc error: code = NotFound desc = could not find container \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": container with ID starting with 0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555214 4720 scope.go:117] "RemoveContainer" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555840 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566548 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566922 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566940 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566960 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566966 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566978 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566983 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566994 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567011 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567018 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567030 4720 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567036 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567044 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="init" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567049 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="init" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567058 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567063 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567213 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567227 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567238 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567247 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567258 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567268 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567279 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.568129 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.571898 4720 scope.go:117] "RemoveContainer" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.572926 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.580892 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615098 4720 scope.go:117] "RemoveContainer" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.615894 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": container with ID starting with 89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4 not found: ID does not exist" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615921 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"} err="failed to get container status \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": rpc error: code = NotFound desc = could not find container \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": container with ID starting with 89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615942 4720 scope.go:117] "RemoveContainer" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.616281 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": container with ID starting with 39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35 not found: ID does not exist" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.616933 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} err="failed to get container status \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": rpc error: code = NotFound desc = could not find container \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": container with ID starting with 39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714719 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714764 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714811 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714861 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714893 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714927 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.805219 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.809061 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821514 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821642 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821761 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821850 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.830329 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.830772 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.832431 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.832790 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.836477 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.898214 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.393434 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.504559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"886885cb6e867abef9b8b6b44ea76bb57f82465acc480ae208bd577e2f5e06f7"} Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.668196 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.691083 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" path="/var/lib/kubelet/pods/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e/volumes" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.692046 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" path="/var/lib/kubelet/pods/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d/volumes" Jan 21 14:49:05 crc kubenswrapper[4720]: I0121 14:49:05.521499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"872a9c27150d862707f5a91175d8e8e15aaba3e73cf731e7a9533577a4dde8c1"} Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.251727 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.252708 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.257544 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.258486 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.258816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5w78j" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.262894 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.368981 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369034 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369080 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnljv\" (UniqueName: 
\"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369133 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.470623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.470947 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471029 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnljv\" (UniqueName: \"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471111 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471740 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.477618 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.484447 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.491943 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnljv\" (UniqueName: \"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc 
kubenswrapper[4720]: I0121 14:49:06.543301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"9d964c86db80d0e6e48d9004a7e85b0f246592766bb6da83415075b51041b482"} Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.565093 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.565073282 podStartE2EDuration="3.565073282s" podCreationTimestamp="2026-01-21 14:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:06.565067712 +0000 UTC m=+1184.473807634" watchObservedRunningTime="2026-01-21 14:49:06.565073282 +0000 UTC m=+1184.473813214" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.573050 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 14:49:07 crc kubenswrapper[4720]: I0121 14:49:07.125250 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:07 crc kubenswrapper[4720]: W0121 14:49:07.134937 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb447ec_c7a1_4d3b_bcb7_e05d5ead9fa6.slice/crio-bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180 WatchSource:0}: Error finding container bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180: Status 404 returned error can't find the container with id bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180 Jan 21 14:49:07 crc kubenswrapper[4720]: I0121 14:49:07.550347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6","Type":"ContainerStarted","Data":"bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180"} Jan 21 14:49:08 crc kubenswrapper[4720]: I0121 14:49:08.898598 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.163590 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.959936 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960384 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent" containerID="cri-o://2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960507 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" containerID="cri-o://c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960544 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core" containerID="cri-o://e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae" gracePeriod=30 Jan 21 
14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960575 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent" containerID="cri-o://b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.974213 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.153:3000/\": read tcp 10.217.0.2:49704->10.217.0.153:3000: read: connection reset by peer" Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631836 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2" exitCode=0 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631880 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae" exitCode=2 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631890 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a" exitCode=0 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631898 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2"} Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631949 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae"} Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631965 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a"} Jan 21 14:49:17 crc kubenswrapper[4720]: I0121 14:49:17.655153 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5" exitCode=0 Jan 21 14:49:17 crc kubenswrapper[4720]: I0121 14:49:17.655246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5"} Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.140574 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270000 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270114 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270247 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270274 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270597 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270615 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270985 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.271029 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.271394 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.272090 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.272116 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.274893 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq" (OuterVolumeSpecName: "kube-api-access-m7gxq") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "kube-api-access-m7gxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.275746 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts" (OuterVolumeSpecName: "scripts") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.296952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.358855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data" (OuterVolumeSpecName: "config-data") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.358865 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373013 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373045 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373057 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373065 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373073 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.664219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6","Type":"ContainerStarted","Data":"782e00c2d9ebe5ff06d1034e3f65d84ed7840161d45f1ff7d564ebc20b056494"} Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667243 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"3174443988b438c1b000ca3e584da6ddab213fd781467fefd432c56f8e7d99aa"} Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667273 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667320 4720 scope.go:117] "RemoveContainer" containerID="c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.687003 4720 scope.go:117] "RemoveContainer" containerID="e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.697816 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.935553431 podStartE2EDuration="12.697797088s" podCreationTimestamp="2026-01-21 14:49:06 +0000 UTC" firstStartedPulling="2026-01-21 14:49:07.136920554 +0000 UTC m=+1185.045660486" lastFinishedPulling="2026-01-21 14:49:17.899164201 +0000 UTC m=+1195.807904143" observedRunningTime="2026-01-21 14:49:18.679170757 +0000 UTC m=+1196.587910700" watchObservedRunningTime="2026-01-21 14:49:18.697797088 +0000 UTC m=+1196.606537020" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.727399 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.751790 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.774743 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775162 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775185 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775205 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775214 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775231 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775239 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775265 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775273 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775457 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775473 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775499 
4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775508 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.778235 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.778266 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.785756 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.785933 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.802967 4720 scope.go:117] "RemoveContainer" containerID="b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.836332 4720 scope.go:117] "RemoveContainer" containerID="2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886733 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886768 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886832 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: 
\"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886877 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988338 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988633 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988808 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988884 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989270 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989418 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989331 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") 
" pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.993798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.995192 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.999032 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.999954 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.007782 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0" Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.108366 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.608265 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:19 crc kubenswrapper[4720]: W0121 14:49:19.612098 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f753ae_130b_46ab_a544_e694a81b09b0.slice/crio-168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9 WatchSource:0}: Error finding container 168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9: Status 404 returned error can't find the container with id 168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9 Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.679837 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9"} Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.808242 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:20 crc kubenswrapper[4720]: I0121 14:49:20.689049 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" path="/var/lib/kubelet/pods/750b936e-3a77-4d1a-abc8-94f4a64cb5f7/volumes" Jan 21 14:49:20 crc kubenswrapper[4720]: I0121 14:49:20.690260 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7"} Jan 21 14:49:21 crc kubenswrapper[4720]: I0121 14:49:21.696934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2"} Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.712633 4720 generic.go:334] "Generic (PLEG): container finished" podID="9647cb32-4c23-445c-a66b-c71439bf617d" containerID="e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" exitCode=137 Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.713177 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8"} Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.725830 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee"} Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.861360 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.879769 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.879816 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059370 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059432 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059550 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059575 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059706 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059772 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059824 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.065789 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.066264 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs" (OuterVolumeSpecName: "logs") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.083767 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.092772 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4" (OuterVolumeSpecName: "kube-api-access-88pl4") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "kube-api-access-88pl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.102480 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts" (OuterVolumeSpecName: "scripts") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.135319 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.141977 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data" (OuterVolumeSpecName: "config-data") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.162974 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163008 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163017 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163025 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163033 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163041 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163049 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736470 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61"} Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736787 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" containerID="cri-o://fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736815 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736866 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" containerID="cri-o://162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736901 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" containerID="cri-o://7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736931 4720 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" containerID="cri-o://80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742702 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"88ba151a7faa9c14ce1ffb6ea4af84a9781883a8fd8016a1bc953df79c7742c2"} Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742754 4720 scope.go:117] "RemoveContainer" containerID="e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742755 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.773774 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037166814 podStartE2EDuration="5.773754037s" podCreationTimestamp="2026-01-21 14:49:18 +0000 UTC" firstStartedPulling="2026-01-21 14:49:19.61474523 +0000 UTC m=+1197.523485162" lastFinishedPulling="2026-01-21 14:49:23.351332463 +0000 UTC m=+1201.260072385" observedRunningTime="2026-01-21 14:49:23.764334945 +0000 UTC m=+1201.673074887" watchObservedRunningTime="2026-01-21 14:49:23.773754037 +0000 UTC m=+1201.682493969" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.789317 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.789471 4720 scope.go:117] "RemoveContainer" containerID="047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.807984 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827182 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: E0121 14:49:23.827621 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827644 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: E0121 14:49:23.827690 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827700 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827874 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827906 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.828776 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834240 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834299 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834614 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.842094 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973406 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973483 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973528 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973571 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973603 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973981 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.974016 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsr9\" (UniqueName: 
\"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.974056 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075669 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075818 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075860 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvsr9\" (UniqueName: \"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075976 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076008 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076088 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076162 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080071 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080517 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080697 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.083362 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.087085 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.102201 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsr9\" (UniqueName: \"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.142582 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.672280 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.694155 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" path="/var/lib/kubelet/pods/9647cb32-4c23-445c-a66b-c71439bf617d/volumes" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761082 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" exitCode=0 Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761110 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" exitCode=2 Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761133 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" exitCode=0 Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761140 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" exitCode=0 Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761176 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761214 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761224 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761242 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761250 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.762519 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"4b94f6b9b9b43b4ec34335af772443783caa0a6f26b0a54adcd5907d5d0118ce"} Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.789058 4720 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887781 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887837 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887869 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887917 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887945 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887979 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888013 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888261 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888431 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.897840 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf" (OuterVolumeSpecName: "kube-api-access-tvmdf") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "kube-api-access-tvmdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.903048 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts" (OuterVolumeSpecName: "scripts") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.933113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989701 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989733 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989748 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989760 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989771 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.992494 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.020929 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data" (OuterVolumeSpecName: "config-data") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.091907 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.091948 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.655633 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656189 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656201 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656215 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656222 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656233 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656241 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656252 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656259 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656404 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656418 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656429 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656438 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.664016 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.674481 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.776414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"9ca8e392ad7fbddfe43873c0128f07b8d0a7c9eb23029d723d3acbcb81226b55"} Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.776459 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.811045 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.812278 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.812323 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.824596 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.862541 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.882125 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.889835 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.890838 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.894066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.894227 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.900125 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.904213 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.910132 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.913991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914166 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914212 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914322 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914455 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914565 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.915917 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.984898 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.992106 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.997828 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015301 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015337 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015401 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015440 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015467 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015503 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015518 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: 
\"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015540 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.016070 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.017635 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.017906 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.018338 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.029648 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.034214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.036543 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.042081 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.046443 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.056212 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.071459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.075529 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.080197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.132403 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.139701 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.158928 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.206073 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.211839 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.226010 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.226086 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.240012 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329110 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329178 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329924 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.341094 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.342438 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.344702 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.373543 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.431235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.431331 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.432349 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.485305 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.532969 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.533233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.533917 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.587228 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.646960 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.648453 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.657900 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.659500 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.751280 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" path="/var/lib/kubelet/pods/68f753ae-130b-46ab-a544-e694a81b09b0/volumes" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.764383 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.849555 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.850192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.964233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.964312 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.965175 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.981580 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.041072 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.074223 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.084561 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.299752 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.323846 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:27 crc kubenswrapper[4720]: W0121 14:49:27.334771 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9cf579e_cb45_4984_8558_107b9576d977.slice/crio-313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9 WatchSource:0}: Error finding container 313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9: Status 404 returned error can't find the container with id 313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9 Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.351499 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.588426 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:27 crc kubenswrapper[4720]: W0121 14:49:27.714260 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad73ec2f_ba76_4451_8202_33403a41de12.slice/crio-9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd WatchSource:0}: Error finding container 9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd: Status 404 returned error can't find the container with id 9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.716692 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.835544 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerStarted","Data":"3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.835582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerStarted","Data":"e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.848139 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-99kbn" podStartSLOduration=1.84812102 podStartE2EDuration="1.84812102s" podCreationTimestamp="2026-01-21 14:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.847735641 +0000 UTC m=+1205.756475573" watchObservedRunningTime="2026-01-21 14:49:27.84812102 +0000 UTC m=+1205.756860952" Jan 21 14:49:27 crc 
kubenswrapper[4720]: I0121 14:49:27.853692 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerStarted","Data":"582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.853738 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerStarted","Data":"fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.862308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerStarted","Data":"9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.866393 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-62k9x" podStartSLOduration=2.866385072 podStartE2EDuration="2.866385072s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.865973512 +0000 UTC m=+1205.774713444" watchObservedRunningTime="2026-01-21 14:49:27.866385072 +0000 UTC m=+1205.775125004" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.876410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerStarted","Data":"76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.876457 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerStarted","Data":"ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.889948 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"a3d07e8f53a4709678b7f961476ba1888c41c2fe302f5ad0101a7fc048065db3"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.890613 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.904591 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" podStartSLOduration=1.904567248 podStartE2EDuration="1.904567248s" podCreationTimestamp="2026-01-21 14:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.902884225 +0000 UTC m=+1205.811624157" watchObservedRunningTime="2026-01-21 14:49:27.904567248 +0000 UTC m=+1205.813307180" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.909032 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"d769c49ed2fe68686c374a2f8612b148bed4023ed4696f58a59cd9bf88586865"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 
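[note] Each pod above logs two ContainerStarted events with different Data IDs: the first is the application container, the second (e.g. e28517e6… for nova-cell1-db-create-99kbn) is evidently the pod sandbox, since that is the ID that later reappears in ContainerDied followed by pod_container_deletor's "Container not found in pod's containers". The event={…} payloads are JSON dumps of kubelet's PLEG pod lifecycle event; a stand-in struct (field names mirrored from the log, not imported from kubelet) is enough to decode them offline:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // podLifecycleEvent mirrors the event={...} payload printed above.
    type podLifecycleEvent struct {
        ID   string `json:"ID"`   // pod UID
        Type string `json:"Type"` // ContainerStarted, ContainerDied, ...
        Data string `json:"Data"` // container (or sandbox) ID
    }

    func main() {
        raw := `{"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerStarted","Data":"3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821"}`
        var ev podLifecycleEvent
        if err := json.Unmarshal([]byte(raw), &ev); err != nil {
            panic(err)
        }
        fmt.Printf("pod %s: %s %.12s…\n", ev.ID, ev.Type, ev.Data)
    }
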
14:49:27.916896 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerStarted","Data":"72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.916946 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerStarted","Data":"d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.918776 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerStarted","Data":"640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.918795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerStarted","Data":"313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.926150 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.926134805 podStartE2EDuration="4.926134805s" podCreationTimestamp="2026-01-21 14:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.923327002 +0000 UTC m=+1205.832066964" watchObservedRunningTime="2026-01-21 14:49:27.926134805 +0000 UTC m=+1205.834874737" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.951835 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-c5zqd" podStartSLOduration=2.9518162869999998 podStartE2EDuration="2.951816287s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.940432894 +0000 UTC m=+1205.849172836" watchObservedRunningTime="2026-01-21 14:49:27.951816287 +0000 UTC m=+1205.860556219" Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.942035 4720 generic.go:334] "Generic (PLEG): container finished" podID="a08abcad-85f1-431b-853e-3599eebed756" containerID="582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.942130 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerDied","Data":"582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.944325 4720 generic.go:334] "Generic (PLEG): container finished" podID="ad73ec2f-ba76-4451-8202-33403a41de12" containerID="8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.944384 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerDied","Data":"8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 
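[note] The generic.go:334 "container finished" lines around this point record the exit code of each one-shot db-create/account-create container; exitCode=0 is how these Job-style pods signal success (compare sg-core's exitCode=2 further down, a non-zero exit during teardown). A trivial classifier for such lines, with the field pattern read off the log rather than any kubelet API:

    package main

    import (
        "fmt"
        "regexp"
    )

    var finished = regexp.MustCompile(`containerID="([0-9a-f]+)" exitCode=(\d+)`)

    func main() {
        lines := []string{
            `containerID="582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3" exitCode=0`,
            `containerID="8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" exitCode=2`,
        }
        for _, l := range lines {
            if m := finished.FindStringSubmatch(l); m != nil {
                verdict := "succeeded"
                if m[2] != "0" {
                    verdict = "failed (exit " + m[2] + ")"
                }
                fmt.Printf("%.12s… %s\n", m[1], verdict)
            }
        }
    }
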
14:49:28.946553 4720 generic.go:334] "Generic (PLEG): container finished" podID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerID="76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.946609 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerDied","Data":"76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.948592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.949640 4720 generic.go:334] "Generic (PLEG): container finished" podID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerID="72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.949694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerDied","Data":"72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.951618 4720 generic.go:334] "Generic (PLEG): container finished" podID="d9cf579e-cb45-4984-8558-107b9576d977" containerID="640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.951702 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerDied","Data":"640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.953628 4720 generic.go:334] "Generic (PLEG): container finished" podID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerID="3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.953743 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerDied","Data":"3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.958255 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-61ab-account-create-update-4mch7" podStartSLOduration=3.958239609 podStartE2EDuration="3.958239609s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.96120703 +0000 UTC m=+1205.869946972" watchObservedRunningTime="2026-01-21 14:49:28.958239609 +0000 UTC m=+1206.866979541" Jan 21 14:49:29 crc kubenswrapper[4720]: I0121 14:49:29.963617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.976895 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerDied","Data":"e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.977198 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.980221 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerDied","Data":"fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.980252 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.981718 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerDied","Data":"9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.981739 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.983228 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerDied","Data":"ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.983245 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.985142 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.988105 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerDied","Data":"d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.988147 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.001789 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.008322 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.014300 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.020184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.028920 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.032042 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095171 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095298 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"a12f971e-bd5e-4b60-9d28-06c786d852ae\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095324 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"a08abcad-85f1-431b-853e-3599eebed756\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095363 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095412 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"ad73ec2f-ba76-4451-8202-33403a41de12\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095456 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095507 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095553 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"d9cf579e-cb45-4984-8558-107b9576d977\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") 
" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"a12f971e-bd5e-4b60-9d28-06c786d852ae\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095647 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"d9cf579e-cb45-4984-8558-107b9576d977\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095680 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"ad73ec2f-ba76-4451-8202-33403a41de12\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095698 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"a08abcad-85f1-431b-853e-3599eebed756\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096074 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a12f971e-bd5e-4b60-9d28-06c786d852ae" (UID: "a12f971e-bd5e-4b60-9d28-06c786d852ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096157 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af31d5e0-11e6-433b-a31e-bea14d7e5c95" (UID: "af31d5e0-11e6-433b-a31e-bea14d7e5c95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096411 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a08abcad-85f1-431b-853e-3599eebed756" (UID: "a08abcad-85f1-431b-853e-3599eebed756"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096685 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad73ec2f-ba76-4451-8202-33403a41de12" (UID: "ad73ec2f-ba76-4451-8202-33403a41de12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.097116 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9cf579e-cb45-4984-8558-107b9576d977" (UID: "d9cf579e-cb45-4984-8558-107b9576d977"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.097399 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01f8146d-b3dd-48a4-b1a8-9fa590c0d808" (UID: "01f8146d-b3dd-48a4-b1a8-9fa590c0d808"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.103618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt" (OuterVolumeSpecName: "kube-api-access-dn6nt") pod "d9cf579e-cb45-4984-8558-107b9576d977" (UID: "d9cf579e-cb45-4984-8558-107b9576d977"). InnerVolumeSpecName "kube-api-access-dn6nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.105218 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp" (OuterVolumeSpecName: "kube-api-access-n7zqp") pod "a08abcad-85f1-431b-853e-3599eebed756" (UID: "a08abcad-85f1-431b-853e-3599eebed756"). InnerVolumeSpecName "kube-api-access-n7zqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.105850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82" (OuterVolumeSpecName: "kube-api-access-rrp82") pod "af31d5e0-11e6-433b-a31e-bea14d7e5c95" (UID: "af31d5e0-11e6-433b-a31e-bea14d7e5c95"). InnerVolumeSpecName "kube-api-access-rrp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.107218 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb" (OuterVolumeSpecName: "kube-api-access-8sndb") pod "a12f971e-bd5e-4b60-9d28-06c786d852ae" (UID: "a12f971e-bd5e-4b60-9d28-06c786d852ae"). InnerVolumeSpecName "kube-api-access-8sndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.107266 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn" (OuterVolumeSpecName: "kube-api-access-ccgqn") pod "01f8146d-b3dd-48a4-b1a8-9fa590c0d808" (UID: "01f8146d-b3dd-48a4-b1a8-9fa590c0d808"). InnerVolumeSpecName "kube-api-access-ccgqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.109825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66" (OuterVolumeSpecName: "kube-api-access-hvt66") pod "ad73ec2f-ba76-4451-8202-33403a41de12" (UID: "ad73ec2f-ba76-4451-8202-33403a41de12"). InnerVolumeSpecName "kube-api-access-hvt66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197730 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197762 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197771 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197780 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197791 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197799 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197808 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197817 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197825 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197834 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197842 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") on node \"crc\" DevicePath 
\"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197850 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005642 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005668 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005691 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005709 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerDied","Data":"313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9"} Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005644 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005762 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.006518 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005736 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9" Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.016109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e"} Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.016586 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.042519 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.692672333 podStartE2EDuration="8.042479547s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="2026-01-21 14:49:27.396513422 +0000 UTC m=+1205.305253344" lastFinishedPulling="2026-01-21 14:49:31.746320626 +0000 UTC m=+1209.655060558" observedRunningTime="2026-01-21 14:49:33.042025236 +0000 UTC m=+1210.950765178" watchObservedRunningTime="2026-01-21 14:49:33.042479547 +0000 UTC m=+1210.951219499" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.347886 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.673766 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674148 4720 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674166 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674178 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674185 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674196 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674202 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674211 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674217 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674227 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674233 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674250 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674255 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674411 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674431 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674440 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674450 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674464 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674478 4720 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674988 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681464 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681754 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681896 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pqrwq" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.692013 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701757 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 
14:49:36.804204 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.810727 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.810811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.823313 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.835999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:37 crc kubenswrapper[4720]: I0121 14:49:37.000982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:37 crc kubenswrapper[4720]: I0121 14:49:37.509146 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:38 crc kubenswrapper[4720]: I0121 14:49:38.065100 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerStarted","Data":"09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49"} Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.153958 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154603 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" containerID="cri-o://7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154707 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" containerID="cri-o://8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154736 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" containerID="cri-o://8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154832 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" containerID="cri-o://1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.182647 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": EOF" Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.123782 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" exitCode=0 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124081 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" exitCode=2 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124093 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" exitCode=0 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.123868 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e"} Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124131 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac"} Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124148 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79"} Jan 21 14:49:43 crc kubenswrapper[4720]: I0121 14:49:43.140427 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" exitCode=0 Jan 21 14:49:43 crc kubenswrapper[4720]: I0121 14:49:43.140489 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb"} Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.051503 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096250 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096294 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096380 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096417 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096460 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.098216 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.098611 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.109368 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4" (OuterVolumeSpecName: "kube-api-access-pj7h4") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "kube-api-access-pj7h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.113860 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts" (OuterVolumeSpecName: "scripts") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200018 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200296 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200366 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200437 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.209521 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.229823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data" (OuterVolumeSpecName: "config-data") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.230223 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"d769c49ed2fe68686c374a2f8612b148bed4023ed4696f58a59cd9bf88586865"} Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237778 4720 scope.go:117] "RemoveContainer" containerID="8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237780 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.270498 4720 scope.go:117] "RemoveContainer" containerID="8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.297536 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.297818 4720 scope.go:117] "RemoveContainer" containerID="1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302280 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302306 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302315 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.322251 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.323012 4720 scope.go:117] "RemoveContainer" containerID="7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.325863 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326197 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" 
containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326213 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326222 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326229 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326237 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326244 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326260 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326266 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326419 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326428 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326440 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326452 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.328247 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.337543 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.341846 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.357042 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404146 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404382 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404447 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404534 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404647 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506423 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 
14:49:50.506760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507041 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507149 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507267 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507820 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.509097 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.512688 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513513 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513592 4720 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513844 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.527716 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.665302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.697222 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" path="/var/lib/kubelet/pods/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f/volumes" Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.122033 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.133017 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.246027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af"} Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.247132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerStarted","Data":"48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a"} Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.278941 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vm954" podStartSLOduration=2.729075439 podStartE2EDuration="15.278918645s" podCreationTimestamp="2026-01-21 14:49:36 +0000 UTC" firstStartedPulling="2026-01-21 14:49:37.548311389 +0000 UTC m=+1215.457051321" lastFinishedPulling="2026-01-21 14:49:50.098154595 +0000 UTC m=+1228.006894527" observedRunningTime="2026-01-21 14:49:51.271071481 +0000 UTC m=+1229.179811423" watchObservedRunningTime="2026-01-21 14:49:51.278918645 +0000 UTC m=+1229.187658587" Jan 21 14:49:52 crc kubenswrapper[4720]: I0121 14:49:52.880484 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:49:52 crc kubenswrapper[4720]: I0121 14:49:52.881064 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:49:53 crc kubenswrapper[4720]: I0121 14:49:53.262042 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d"} Jan 21 14:49:58 crc kubenswrapper[4720]: I0121 14:49:58.304670 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe"} Jan 21 14:49:58 crc kubenswrapper[4720]: I0121 14:49:58.305075 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2"} Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.322487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8"} Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.322747 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.354185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.105467285 podStartE2EDuration="10.354168908s" podCreationTimestamp="2026-01-21 14:49:50 +0000 UTC" firstStartedPulling="2026-01-21 14:49:51.132831901 +0000 UTC m=+1229.041571833" lastFinishedPulling="2026-01-21 14:49:59.381533524 +0000 UTC m=+1237.290273456" observedRunningTime="2026-01-21 14:50:00.353401293 +0000 UTC m=+1238.262141225" watchObservedRunningTime="2026-01-21 14:50:00.354168908 +0000 UTC m=+1238.262908840" Jan 21 14:50:07 crc kubenswrapper[4720]: I0121 14:50:07.400021 4720 generic.go:334] "Generic (PLEG): container finished" podID="4dda8050-939a-4a64-b119-b718b60c7887" containerID="48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a" exitCode=0 Jan 21 14:50:07 crc kubenswrapper[4720]: I0121 14:50:07.400109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerDied","Data":"48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a"} Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.721920 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860025 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860062 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860248 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.884490 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts" (OuterVolumeSpecName: "scripts") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.884797 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz" (OuterVolumeSpecName: "kube-api-access-59cmz") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "kube-api-access-59cmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.886038 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data" (OuterVolumeSpecName: "config-data") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.890833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962008 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962055 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962069 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962082 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421311 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421287 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerDied","Data":"09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49"} Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421740 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.560601 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 14:50:09 crc kubenswrapper[4720]: E0121 14:50:09.561022 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561038 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561272 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561925 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.569427 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pqrwq" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.569548 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.572469 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676766 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676897 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.778904 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.778979 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.779077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.782431 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.782673 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.798667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.879074 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:10 crc kubenswrapper[4720]: I0121 14:50:10.306555 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 14:50:10 crc kubenswrapper[4720]: I0121 14:50:10.431488 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"496cefe3-f97b-4d8c-9a25-4a6533d9e64c","Type":"ContainerStarted","Data":"ed3065c6ccdab478fef7ce2febafea92e13791b8f1eec5733513fca88ae4ab3e"} Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.441555 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"496cefe3-f97b-4d8c-9a25-4a6533d9e64c","Type":"ContainerStarted","Data":"2a680d9aa6a57e0b34471343eed81fcf53b0f5f5c62294c89e943586c1975389"} Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.442665 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.462065 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.462047108 podStartE2EDuration="2.462047108s" podCreationTimestamp="2026-01-21 14:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:11.454291095 +0000 UTC m=+1249.363031027" watchObservedRunningTime="2026-01-21 14:50:11.462047108 +0000 UTC m=+1249.370787030" Jan 21 14:50:19 crc kubenswrapper[4720]: I0121 14:50:19.905083 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.397780 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.398883 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.407147 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.409364 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.409579 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454777 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454852 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454922 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556090 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556225 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556264 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.575512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.576723 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.586209 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.588180 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.589301 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.600131 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.611399 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.623621 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.625034 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.627091 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.633139 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658665 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658732 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658786 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658860 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658912 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.677320 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.708618 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.710152 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.714243 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.730179 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.748096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.758191 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781549 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781620 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781644 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781729 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781762 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781796 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781824 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781938 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781964 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.786749 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.795761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.795910 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.798865 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.801565 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.843165 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.847269 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:20 crc 
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.848606 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.866273 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.874526 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.886714 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887792 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887897 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.888018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.888512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.894259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.895322 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.916263 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.917822 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.953275 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.953345 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989684 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989734 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990269 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990313 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990337 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.003261 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.014782 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.023810 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.091965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092124 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092236 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.095715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.100546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.102084 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.105202 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.109462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.111308 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.114518 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.114957 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w"
Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.175802 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.238642 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.478029 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.569318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerStarted","Data":"fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c"} Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.652232 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.795092 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.924007 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.962483 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.038169 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.039152 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.041976 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.043647 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.058897 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137090 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137200 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.147079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:50:22 crc kubenswrapper[4720]: W0121 14:50:22.150815 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f51fb54_b6cb_4a03_b378_714f549cd2a1.slice/crio-380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f WatchSource:0}: Error finding container 380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f: Status 404 returned error can't find the container with id 380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243084 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243707 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243777 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.252445 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.255231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.265417 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.268742 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.412043 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585587 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerID="c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b" exitCode=0 Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585693 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585718 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerStarted","Data":"380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.589376 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerStarted","Data":"0cb0e309b39bce610a72ded1405c14599f22a6c20641e56fd95873ffd0658fca"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.593780 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerStarted","Data":"1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.596120 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"8d9596c402475afa19cd2db87682ff4b62a90d083e56dc241ca3ef4472369439"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.600415 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerStarted","Data":"288c85dff47a79ec2a1b499393b40b7854d1dcf0eb1a7514afc8487559facb57"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.607607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"3f3474f70de2d6017a446624474e6e64bbd65c26a6c2176f0176715419053ba3"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.649535 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jcm9t" podStartSLOduration=2.649510535 podStartE2EDuration="2.649510535s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:22.633208603 +0000 UTC m=+1260.541948545" watchObservedRunningTime="2026-01-21 14:50:22.649510535 +0000 UTC m=+1260.558250467" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.880112 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.880157 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.881722 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.882321 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.882366 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" gracePeriod=600 Jan 21 14:50:23 crc kubenswrapper[4720]: I0121 14:50:23.047138 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:23 crc kubenswrapper[4720]: I0121 14:50:23.627202 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerStarted","Data":"2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80"} Jan 21 14:50:23 crc kubenswrapper[4720]: E0121 14:50:23.834840 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1128ddd_06c2_4255_aa17_b62aa0f8a996.slice/crio-conmon-c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.637493 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerStarted","Data":"ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.639240 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.651863 4720 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerStarted","Data":"0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657127 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" exitCode=0 Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657185 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657246 4720 scope.go:117] "RemoveContainer" containerID="533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.668362 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podStartSLOduration=4.668344846 podStartE2EDuration="4.668344846s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:24.664967604 +0000 UTC m=+1262.573707546" watchObservedRunningTime="2026-01-21 14:50:24.668344846 +0000 UTC m=+1262.577084778" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.699833 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" podStartSLOduration=2.699815091 podStartE2EDuration="2.699815091s" podCreationTimestamp="2026-01-21 14:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:24.699323021 +0000 UTC m=+1262.608062953" watchObservedRunningTime="2026-01-21 14:50:24.699815091 +0000 UTC m=+1262.608555023" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.723844 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.790195 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:27 crc kubenswrapper[4720]: I0121 14:50:27.584707 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:27 crc kubenswrapper[4720]: I0121 14:50:27.586703 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" containerID="cri-o://ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" gracePeriod=30 Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.614324 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.703821 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.721031 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h" (OuterVolumeSpecName: "kube-api-access-mcq8h") pod "ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" (UID: "ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7"). InnerVolumeSpecName "kube-api-access-mcq8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.756207 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759481 4720 generic.go:334] "Generic (PLEG): container finished" podID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" exitCode=2 Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759519 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerDied","Data":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759541 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerDied","Data":"b5de03c99a86e921243af3619119b73c952c5f3ccc688bb6fd4a69b6fda32dd9"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759555 4720 scope.go:117] "RemoveContainer" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759645 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.800381 4720 scope.go:117] "RemoveContainer" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: E0121 14:50:28.803116 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": container with ID starting with ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92 not found: ID does not exist" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.803153 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} err="failed to get container status \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": rpc error: code = NotFound desc = could not find container \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": container with ID starting with ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92 not found: ID does not exist" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.805893 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.845013 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.864215 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872143 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: E0121 14:50:28.872463 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872480 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872644 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.873250 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.880685 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.898871 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.899108 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010185 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010471 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010575 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010604 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.066436 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.066937 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" containerID="cri-o://09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.067006 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" containerID="cri-o://4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.067053 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" containerID="cri-o://2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.068033 4720 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" containerID="cri-o://0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.111887 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.112153 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.112756 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.113026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.118626 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.119116 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.119933 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.133758 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.212560 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.770937 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerStarted","Data":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.774988 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" exitCode=0 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775018 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" exitCode=2 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775075 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775151 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.776753 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.776781 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.777775 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerStarted","Data":"43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.777902 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.788889 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.788940 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.789248 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" containerID="cri-o://625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.789409 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" containerID="cri-o://2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.800510 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.397837825 podStartE2EDuration="9.800494103s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.932876714 +0000 UTC m=+1259.841616646" lastFinishedPulling="2026-01-21 14:50:28.335532992 +0000 UTC m=+1266.244272924" observedRunningTime="2026-01-21 14:50:29.799982513 +0000 UTC m=+1267.708722455" watchObservedRunningTime="2026-01-21 14:50:29.800494103 +0000 UTC m=+1267.709234035" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.828497 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.172110635 podStartE2EDuration="9.8283568s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.673946988 +0000 UTC m=+1259.582686920" lastFinishedPulling="2026-01-21 14:50:28.330193153 +0000 UTC m=+1266.238933085" observedRunningTime="2026-01-21 14:50:29.820200369 +0000 UTC m=+1267.728940311" watchObservedRunningTime="2026-01-21 14:50:29.8283568 +0000 UTC m=+1267.737096732" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.874736 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.880284 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.492479152 podStartE2EDuration="9.880260004s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.961562346 +0000 UTC m=+1259.870302278" lastFinishedPulling="2026-01-21 14:50:28.349343198 +0000 UTC m=+1266.258083130" observedRunningTime="2026-01-21 14:50:29.848109797 +0000 UTC m=+1267.756849729" watchObservedRunningTime="2026-01-21 14:50:29.880260004 +0000 UTC m=+1267.788999936" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.897753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.360201027 podStartE2EDuration="9.897734568s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.801460645 +0000 UTC m=+1259.710200577" lastFinishedPulling="2026-01-21 14:50:28.338994186 +0000 UTC m=+1266.247734118" observedRunningTime="2026-01-21 14:50:29.891154626 +0000 UTC m=+1267.799894558" watchObservedRunningTime="2026-01-21 14:50:29.897734568 +0000 UTC m=+1267.806474500" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.506468 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.592961 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593023 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593087 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593162 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593288 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs" (OuterVolumeSpecName: "logs") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593555 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.608816 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw" (OuterVolumeSpecName: "kube-api-access-qtptw") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "kube-api-access-qtptw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.625192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.635441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data" (OuterVolumeSpecName: "config-data") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.688017 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" path="/var/lib/kubelet/pods/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7/volumes" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694875 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694907 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694920 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.799256 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" exitCode=0 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.799317 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.801893 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60d4c6e3-4a01-421e-aad1-1972ed16e528","Type":"ContainerStarted","Data":"b02a2e2c9834b0d63d9f335f9a3cc75d801cb2b6e47c24b7c8e1f9e825ba5396"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.802058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60d4c6e3-4a01-421e-aad1-1972ed16e528","Type":"ContainerStarted","Data":"ec0fe64aa799c048205ddaaddeeacf190263a67aeec33a00b9e7c524c15f979a"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.802420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804278 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4a03426-f037-45b9-8415-306cc3d2a735" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" exitCode=0 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804309 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4a03426-f037-45b9-8415-306cc3d2a735" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" exitCode=143 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804358 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804379 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"3f3474f70de2d6017a446624474e6e64bbd65c26a6c2176f0176715419053ba3"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804387 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804568 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.854044 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.471599029 podStartE2EDuration="2.853802163s" podCreationTimestamp="2026-01-21 14:50:28 +0000 UTC" firstStartedPulling="2026-01-21 14:50:29.857592033 +0000 UTC m=+1267.766331965" lastFinishedPulling="2026-01-21 14:50:30.239795157 +0000 UTC m=+1268.148535099" observedRunningTime="2026-01-21 14:50:30.848866492 +0000 UTC m=+1268.757606424" watchObservedRunningTime="2026-01-21 14:50:30.853802163 +0000 UTC m=+1268.762542115" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.890340 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.896086 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906216 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.906548 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906563 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.906625 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906633 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906789 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906802 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.907596 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.912485 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.913357 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.924987 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.930037 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965291 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.965750 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965778 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} err="failed to get container status \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965798 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.965978 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965999 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} err="failed to get container status \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966012 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966837 4720 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} err="failed to get container status \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966887 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.967291 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} err="failed to get container status \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:30.999894 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000060 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000099 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000136 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000172 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.004004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.017821 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.017966 
4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101830 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101881 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101910 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101936 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101987 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.102779 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.106548 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.107259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.111099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.137861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh65s\" (UniqueName: 
\"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.177354 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.177396 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.234935 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.239827 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.243273 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.362001 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.362443 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" containerID="cri-o://332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" gracePeriod=10 Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.826664 4720 generic.go:334] "Generic (PLEG): container finished" podID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerID="332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" exitCode=0 Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.826860 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95"} Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.886916 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.936430 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.062033 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.062623 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.069376 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136588 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136750 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136779 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136893 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.159705 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8" (OuterVolumeSpecName: "kube-api-access-8bww8") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "kube-api-access-8bww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.241061 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.279173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.284614 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.300170 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config" (OuterVolumeSpecName: "config") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.308971 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342417 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342670 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342764 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342851 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.713711 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" path="/var/lib/kubelet/pods/b4a03426-f037-45b9-8415-306cc3d2a735/volumes" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.835787 4720 generic.go:334] "Generic (PLEG): container finished" podID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerID="1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530" exitCode=0 Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.835935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerDied","Data":"1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838341 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838506 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"0c16d2b92ff3ecd28762fa538726ba17cdec4aab7f351b70cec6f1779e6f1d03"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844382 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"c28c627632464fe77ab3019eb5addeb203e6d652bffef45ba63964d4aabbdd0c"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844415 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844746 4720 scope.go:117] "RemoveContainer" containerID="332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.871566 4720 scope.go:117] "RemoveContainer" containerID="4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.943854 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.943837246 podStartE2EDuration="2.943837246s" podCreationTimestamp="2026-01-21 14:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:32.921783046 +0000 UTC m=+1270.830522978" watchObservedRunningTime="2026-01-21 14:50:32.943837246 +0000 UTC m=+1270.852577178" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.945541 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.974496 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.857756 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" exitCode=0 Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858006 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2"} Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af"} Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858223 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.942936 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973595 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973724 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973834 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973856 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973942 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973989 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.980788 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.981560 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.992944 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts" (OuterVolumeSpecName: "scripts") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.994785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5" (OuterVolumeSpecName: "kube-api-access-7ggb5") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "kube-api-access-7ggb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.020781 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075405 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075433 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075442 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075450 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075458 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.136911 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.159864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data" (OuterVolumeSpecName: "config-data") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.181725 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.181788 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.336986 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.385578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386009 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386112 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.401774 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts" (OuterVolumeSpecName: "scripts") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.408852 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg" (OuterVolumeSpecName: "kube-api-access-9w4hg") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "kube-api-access-9w4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.445497 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.445589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data" (OuterVolumeSpecName: "config-data") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489022 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489065 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489084 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489109 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.687522 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" path="/var/lib/kubelet/pods/2a0b57dc-517a-404a-a47d-1f86009fad51/volumes" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.868089 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869072 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerDied","Data":"fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c"} Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869529 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.894349 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.904602 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947389 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947903 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947928 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947942 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947952 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947964 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947970 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947992 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948020 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948033 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948041 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948057 4720 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="init" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="init" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948252 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948268 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948277 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948287 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948300 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948310 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.950145 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954028 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954248 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954376 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.965719 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.995908 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996170 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996566 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996828 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996946 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.062682 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.062918 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" containerID="cri-o://3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.063051 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" containerID="cri-o://1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.084823 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.085397 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" containerID="cri-o://aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098182 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098267 4720 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098317 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098344 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098379 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098403 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098437 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098836 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.100133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.122217 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.122519 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.123384 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.124499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.125276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.130142 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.131267 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" containerID="cri-o://3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.130559 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" containerID="cri-o://ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.155337 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.280283 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.796933 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.877525 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"095eb50a9535270990dd51b698c9cf80b5e404f52878168a52724d7ce2256d5b"} Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.879449 4720 generic.go:334] "Generic (PLEG): container finished" podID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" exitCode=143 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.879515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.881584 4720 generic.go:334] "Generic (PLEG): container finished" podID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" exitCode=143 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.881633 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.179794 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.185528 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.187428 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.187469 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.243724 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.243804 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 
14:50:36.688787 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" path="/var/lib/kubelet/pods/5dd7d19f-79e4-47c9-9934-cc003fe551db/volumes" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.797831 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839756 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839820 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839890 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839960 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.840011 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.841092 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs" (OuterVolumeSpecName: "logs") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.860151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s" (OuterVolumeSpecName: "kube-api-access-nh65s") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "kube-api-access-nh65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.867392 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data" (OuterVolumeSpecName: "config-data") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.887618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.905638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917320 4720 generic.go:334] "Generic (PLEG): container finished" podID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" exitCode=0 Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917388 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917542 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"0c16d2b92ff3ecd28762fa538726ba17cdec4aab7f351b70cec6f1779e6f1d03"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917567 4720 scope.go:117] "RemoveContainer" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.918464 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.936307 4720 generic.go:334] "Generic (PLEG): container finished" podID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerID="0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c" exitCode=0 Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.936634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerDied","Data":"0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942137 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942164 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942175 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942186 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.960891 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.965769 4720 scope.go:117] "RemoveContainer" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.988568 4720 scope.go:117] "RemoveContainer" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.989360 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": container with ID starting with 3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1 not found: ID does not exist" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.989404 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} err="failed to get container status \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": rpc error: code = NotFound desc = could not find container \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": container with ID starting with 3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1 not found: ID does not exist" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.989432 4720 scope.go:117] "RemoveContainer" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.990671 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": container with ID starting with ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e not found: ID does not exist" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.990695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} err="failed to get container status \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": rpc error: code = NotFound desc = could not find container \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": container with ID starting with ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e not found: ID does not exist" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.044605 4720 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.261752 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.272022 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.281315 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: E0121 14:50:37.282227 4720 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282332 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: E0121 14:50:37.282407 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282465 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282936 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.283024 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.286084 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.288885 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.289238 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.301515 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.352942 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.353282 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.354073 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356145 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356248 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs682\" (UniqueName: 
\"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356520 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356679 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356825 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.377992 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.379482 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl" (OuterVolumeSpecName: "kube-api-access-fwpkl") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "kube-api-access-fwpkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.431095 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.444332 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data" (OuterVolumeSpecName: "config-data") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461835 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461974 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462204 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462275 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462401 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461840 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.466119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.466594 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.468490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.476026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.617711 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.953061 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.955424 4720 generic.go:334] "Generic (PLEG): container finished" podID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" exitCode=0 Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.955907 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerDied","Data":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958399 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerDied","Data":"0cb0e309b39bce610a72ded1405c14599f22a6c20641e56fd95873ffd0658fca"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958415 4720 scope.go:117] "RemoveContainer" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.018876 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.050308 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.059984 4720 scope.go:117] "RemoveContainer" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.062774 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: E0121 14:50:38.063241 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.063255 4720 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.063431 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.064046 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: E0121 14:50:38.064609 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": container with ID starting with aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7 not found: ID does not exist" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.064631 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} err="failed to get container status \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": rpc error: code = NotFound desc = could not find container \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": container with ID starting with aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7 not found: ID does not exist" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.069358 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.088097 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176466 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176569 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.240236 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.277731 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc 
kubenswrapper[4720]: I0121 14:50:38.277831 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.277888 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.283204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.285261 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.296408 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.310012 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382728 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382958 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.383347 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.389852 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts" (OuterVolumeSpecName: "scripts") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.389920 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw" (OuterVolumeSpecName: "kube-api-access-62smw") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "kube-api-access-62smw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.393915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.428174 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.430850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data" (OuterVolumeSpecName: "config-data") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "config-data". 
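
NOTE: The kube-api-access-62smw volume being unmounted here is the service-account token volume Kubernetes injects into every pod: a projected volume combining a bound token, the cluster CA bundle, and the pod's namespace. A sketch of the equivalent spec using recent k8s.io/api types; the structure below is the conventional injected shape, but the expiry value (3607s is the usual default) and the volume name are illustrative:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // conventional default for the injected token
        // Approximation of an auto-injected "kube-api-access-*" volume:
        // bound SA token + cluster CA bundle + the pod's namespace.
        vol := corev1.Volume{
            Name: "kube-api-access-62smw",
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path: "token", ExpirationSeconds: &expiry}},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}}}},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"}}}}},
                    },
                },
            },
        }
        fmt.Println(vol.Name, len(vol.VolumeSource.Projected.Sources), "sources")
    }
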
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485813 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485853 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485865 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485873 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.697123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" path="/var/lib/kubelet/pods/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6/volumes" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.698280 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" path="/var/lib/kubelet/pods/8b5b74e2-e979-488c-a3aa-cdb564e41206/volumes" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.880852 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.976100 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.994196 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014354 4720 generic.go:334] "Generic (PLEG): container finished" podID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" exitCode=0 Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014451 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"8d9596c402475afa19cd2db87682ff4b62a90d083e56dc241ca3ef4472369439"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014528 4720 scope.go:117] "RemoveContainer" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014625 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerDied","Data":"2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053644 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053719 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.054506 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055133 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055283 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055319 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055329 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055347 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055356 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055938 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055977 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055997 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.060706 4720 util.go:30] "No sandbox for pod can be found. 
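
NOTE: The "SyncLoop (PLEG)" and "Generic (PLEG): container finished" lines are the pod lifecycle event generator feeding kubelet's sync loop; in the event payloads above, ID is the pod UID and Data is the container ID. The E-level RemoveStaleState lines that follow are housekeeping, not failures: the CPU and memory managers drop per-container state left by a replaced pod UID. A paraphrased sketch of how such events drive the loop (hypothetical types, not kubelet's actual code):

    package main

    import "fmt"

    // Shape of the events in the "SyncLoop (PLEG)" lines, paraphrased:
    // ID is the pod UID, Data the container ID.
    type podLifecycleEvent struct {
        ID, Type, Data string
    }

    func handle(e podLifecycleEvent) {
        switch e.Type {
        case "ContainerStarted":
            fmt.Printf("pod %s: container %s started -> sync pod\n", e.ID, e.Data)
        case "ContainerDied":
            fmt.Printf("pod %s: container %s died -> clean up, maybe restart\n", e.ID, e.Data)
        }
    }

    func main() {
        handle(podLifecycleEvent{"684b734f", "ContainerDied", "aa4eba24"})
    }
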
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.061612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerStarted","Data":"6a4188e9bbe7707a1cbd5fc7c33ecb7166835f18ea14b69fcb7fc8e351f09029"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.062206 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.066249 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070930 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070971 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070980 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"9bd73af5fd59322a2bd5b4dadb3b5852cd6bfb2cf195e8e11949965c74ef70f1"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.104523 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.104923 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105012 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105269 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105707 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105829 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.111829 4720 scope.go:117] "RemoveContainer" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.112404 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs" (OuterVolumeSpecName: "logs") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.119000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v" (OuterVolumeSpecName: "kube-api-access-flf6v") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "kube-api-access-flf6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.124391 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.124364651 podStartE2EDuration="2.124364651s" podCreationTimestamp="2026-01-21 14:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:39.117630046 +0000 UTC m=+1277.026369988" watchObservedRunningTime="2026-01-21 14:50:39.124364651 +0000 UTC m=+1277.033104583" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.151570 4720 scope.go:117] "RemoveContainer" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.152104 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": container with ID starting with 1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45 not found: ID does not exist" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.152150 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} err="failed to get container status \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": rpc error: code = NotFound desc = could not find container \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": container with ID starting with 1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45 not found: ID does not exist" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 
14:50:39.152170 4720 scope.go:117] "RemoveContainer" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.152409 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": container with ID starting with 3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1 not found: ID does not exist" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.152427 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} err="failed to get container status \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": rpc error: code = NotFound desc = could not find container \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": container with ID starting with 3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1 not found: ID does not exist" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.173771 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data" (OuterVolumeSpecName: "config-data") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.180845 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "combined-ca-bundle". 
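
NOTE: The E-level "ContainerStatus from runtime service failed ... NotFound" entries immediately after RemoveContainer are benign: by the time the follow-up status lookup runs, the container is already gone, and "already deleted" satisfies the goal of deletion. A sketch of that tolerance as it is commonly coded against a gRPC CRI client (the helper below is hypothetical):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent treats NotFound from the runtime as success, the way
    // the DeleteContainer path above ends up behaving: the goal state
    // ("container gone") is already met.
    func removeIfPresent(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    func main() {
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        fmt.Println(removeIfPresent(gone, "aa4eba24")) // <nil>
    }
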
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.207929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.207994 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208050 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208199 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208215 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208228 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208261 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.211784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.217226 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.224063 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.249756 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 
14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.398535 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.411152 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.461425 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.485996 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.487608 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.490140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.512197 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642574 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642677 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642715 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745742 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: 
\"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745991 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.747162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.755536 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.766138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.769510 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.812780 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.926256 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.084558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerStarted","Data":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.093239 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.093865 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.095706 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"679bb64e-c157-415f-9214-0f4e62001f03","Type":"ContainerStarted","Data":"d2faa52133bae0219c5f601518ddac5869d940e38bedf4585888fa7d17866164"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.111398 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.111376061 podStartE2EDuration="3.111376061s" podCreationTimestamp="2026-01-21 14:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:40.101227463 +0000 UTC m=+1278.009967405" watchObservedRunningTime="2026-01-21 14:50:40.111376061 +0000 UTC m=+1278.020115993" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.161613 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.732283312 podStartE2EDuration="6.161596413s" podCreationTimestamp="2026-01-21 14:50:34 +0000 UTC" firstStartedPulling="2026-01-21 14:50:35.805870437 +0000 UTC m=+1273.714610379" lastFinishedPulling="2026-01-21 14:50:39.235183548 +0000 UTC m=+1277.143923480" observedRunningTime="2026-01-21 14:50:40.121483688 +0000 UTC m=+1278.030223620" watchObservedRunningTime="2026-01-21 14:50:40.161596413 +0000 UTC m=+1278.070336345" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.309931 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.687980 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" path="/var/lib/kubelet/pods/513a3b4c-405a-4045-a76b-acf59f0cfd3a/volumes" Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109760 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109832 4720 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"1ac2709f0fcedf3f81608ca1a0f69ad5080c7f047fe98d3ffad1aa7ecce36ad0"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.111956 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"679bb64e-c157-415f-9214-0f4e62001f03","Type":"ContainerStarted","Data":"dac570b967201ec13bde624a0f28edde829ebf2e73a8c0ad20002188376c9b7b"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.130774 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.130754691 podStartE2EDuration="2.130754691s" podCreationTimestamp="2026-01-21 14:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:41.125707787 +0000 UTC m=+1279.034447729" watchObservedRunningTime="2026-01-21 14:50:41.130754691 +0000 UTC m=+1279.039494623" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.119593 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.617818 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.617871 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.394300 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.964895 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=4.96463424 podStartE2EDuration="4.96463424s" podCreationTimestamp="2026-01-21 14:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:41.158101399 +0000 UTC m=+1279.066841331" watchObservedRunningTime="2026-01-21 14:50:43.96463424 +0000 UTC m=+1281.873374202" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.965227 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.967076 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.985354 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.035968 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.036181 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.036284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138905 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138938 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.139419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.139803 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.159460 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.298626 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.781339 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.144609 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" exitCode=0 Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.144781 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414"} Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.145019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"abc55b9d285c58116da7b148c4e87092b6925a74a8b7d4e6ff71e53eb61cdc76"} Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.162876 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.618325 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.618373 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.394992 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.418589 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.631899 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.631931 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.206882 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.440460 4720 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.813675 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.813720 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:50 crc kubenswrapper[4720]: I0121 14:50:50.895886 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:50 crc kubenswrapper[4720]: I0121 14:50:50.895887 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:51 crc kubenswrapper[4720]: I0121 14:50:51.197257 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" exitCode=0 Jan 21 14:50:51 crc kubenswrapper[4720]: I0121 14:50:51.197329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} Jan 21 14:50:53 crc kubenswrapper[4720]: I0121 14:50:53.223780 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} Jan 21 14:50:53 crc kubenswrapper[4720]: I0121 14:50:53.246871 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prdsm" podStartSLOduration=2.98547359 podStartE2EDuration="10.246851185s" podCreationTimestamp="2026-01-21 14:50:43 +0000 UTC" firstStartedPulling="2026-01-21 14:50:45.146444516 +0000 UTC m=+1283.055184448" lastFinishedPulling="2026-01-21 14:50:52.407822071 +0000 UTC m=+1290.316562043" observedRunningTime="2026-01-21 14:50:53.245864126 +0000 UTC m=+1291.154604098" watchObservedRunningTime="2026-01-21 14:50:53.246851185 +0000 UTC m=+1291.155591127" Jan 21 14:50:54 crc kubenswrapper[4720]: I0121 14:50:54.300247 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:54 crc kubenswrapper[4720]: I0121 14:50:54.301063 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:55 crc kubenswrapper[4720]: I0121 14:50:55.355280 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" probeResult="failure" output=< Jan 21 14:50:55 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:50:55 crc kubenswrapper[4720]: > Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.635934 4720 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.636332 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.680207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.682009 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.816931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.817215 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.817492 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.818101 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.829832 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.835287 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.061495 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.063640 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.083667 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144802 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144903 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.145017 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.145050 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246545 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246676 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246741 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.247711 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.247758 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.248367 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.248481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.265910 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.288055 4720 generic.go:334] "Generic (PLEG): container finished" podID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerID="43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" exitCode=137 Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.289794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerDied","Data":"43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437"} Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.473393 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.627174 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759155 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759219 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759301 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.773033 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l" (OuterVolumeSpecName: "kube-api-access-tpb4l") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "kube-api-access-tpb4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.794490 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.819000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data" (OuterVolumeSpecName: "config-data") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.860914 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.861162 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.861253 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.048758 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.297235 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.298346 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerDied","Data":"288c85dff47a79ec2a1b499393b40b7854d1dcf0eb1a7514afc8487559facb57"} Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.298411 4720 scope.go:117] "RemoveContainer" containerID="43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.300685 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"62714c9a0f1a3b425c21ab81569bf9c4c0ba1448aea15537467fba81fe36bdf5"} Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.334231 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.339446 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.390422 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: E0121 14:51:01.391731 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.391789 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.400883 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.402184 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.404819 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.404853 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.405154 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.410265 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477350 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477403 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477494 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477557 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.579722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580047 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580242 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580335 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580490 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585622 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.592185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.600115 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.720175 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.181783 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.314605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ea3e3dd-0e39-4a28-9112-27f0874af221","Type":"ContainerStarted","Data":"c656f4a85b4359ec9b810d9745ead588aa464c8e0e2def306f5ade53cbe97c15"} Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.693314 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" path="/var/lib/kubelet/pods/05605fa3-fac7-4375-8a3b-ff90d2664098/volumes" Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.780999 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.781261 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" containerID="cri-o://970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" gracePeriod=30 Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.781330 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" containerID="cri-o://58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" gracePeriod=30 Jan 21 14:51:03 crc kubenswrapper[4720]: I0121 14:51:03.324162 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.336599 4720 generic.go:334] "Generic (PLEG): container finished" podID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerID="970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" exitCode=143 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.336713 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.340585 4720 generic.go:334] "Generic (PLEG): container finished" podID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" exitCode=0 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.340782 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.342730 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ea3e3dd-0e39-4a28-9112-27f0874af221","Type":"ContainerStarted","Data":"2f03ec3393a2878809a279f895ea20586e7f009c9c39727ea085c5a0d12d7584"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.403300 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=3.403279987 podStartE2EDuration="3.403279987s" podCreationTimestamp="2026-01-21 14:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:04.389756745 +0000 UTC m=+1302.298496677" watchObservedRunningTime="2026-01-21 14:51:04.403279987 +0000 UTC m=+1302.312019919" Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.628867 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629426 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" containerID="cri-o://5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629559 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" containerID="cri-o://e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629601 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" containerID="cri-o://00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629638 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" containerID="cri-o://2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.737529 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.177:3000/\": read tcp 10.217.0.2:39390->10.217.0.177:3000: read: connection reset by peer" Jan 21 14:51:05 crc kubenswrapper[4720]: E0121 14:51:05.001158 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12d3be7d_16ff_43df_a7d5_266f2b1d4308.slice/crio-conmon-e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.280995 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.177:3000/\": dial tcp 10.217.0.177:3000: connect: connection refused" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.347941 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" probeResult="failure" output=< Jan 21 14:51:05 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:51:05 crc kubenswrapper[4720]: > Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361438 4720 
generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" exitCode=0 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361481 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" exitCode=2 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361493 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" exitCode=0 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361533 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361561 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.369843 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.389986 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" podStartSLOduration=5.389970538 podStartE2EDuration="5.389970538s" podCreationTimestamp="2026-01-21 14:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:05.388962222 +0000 UTC m=+1303.297702154" watchObservedRunningTime="2026-01-21 14:51:05.389970538 +0000 UTC m=+1303.298710470" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.474448 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.385495 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" exitCode=0 Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.385735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab"} Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.508707 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590723 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590773 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590802 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590839 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590870 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590999 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591060 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591443 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591562 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.596163 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts" (OuterVolumeSpecName: "scripts") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.600029 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp" (OuterVolumeSpecName: "kube-api-access-2fwtp") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "kube-api-access-2fwtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.615016 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.663886 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693063 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693092 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693101 4720 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693111 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693121 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693129 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.696173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.708539 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data" (OuterVolumeSpecName: "config-data") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.750076 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.796470 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.796510 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.414307 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.416694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"095eb50a9535270990dd51b698c9cf80b5e404f52878168a52724d7ce2256d5b"} Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.416728 4720 scope.go:117] "RemoveContainer" containerID="e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.459749 4720 scope.go:117] "RemoveContainer" containerID="00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.468441 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.478515 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506347 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506825 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506840 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506854 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506869 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506892 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506900 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506922 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506928 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507095 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507110 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507118 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507134 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.508814 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.511471 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.511782 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.518367 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.519099 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.519602 4720 scope.go:117] "RemoveContainer" containerID="2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.577938 4720 scope.go:117] "RemoveContainer" containerID="5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613011 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613429 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613556 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613698 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613779 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613873 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613975 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.715964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716023 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716058 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716076 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716100 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716116 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716176 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc 
kubenswrapper[4720]: I0121 14:51:07.717728 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.717999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.721231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.723826 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.724245 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.724477 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.726689 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.740785 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.881775 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.372119 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:08 crc kubenswrapper[4720]: W0121 14:51:08.377015 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d3ebf5b_f0c4_472e_b4a3_e5f8cab66ffe.slice/crio-6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7 WatchSource:0}: Error finding container 6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7: Status 404 returned error can't find the container with id 6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7 Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.446492 4720 generic.go:334] "Generic (PLEG): container finished" podID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerID="58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" exitCode=0 Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.447112 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778"} Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.451639 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7"} Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.611459 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.689176 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" path="/var/lib/kubelet/pods/12d3be7d-16ff-43df-a7d5-266f2b1d4308/volumes" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736689 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736787 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736842 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.737321 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs" (OuterVolumeSpecName: "logs") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.746514 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24" (OuterVolumeSpecName: "kube-api-access-vxg24") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "kube-api-access-vxg24". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.772120 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.775325 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data" (OuterVolumeSpecName: "config-data") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838863 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838901 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838917 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838930 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.461502 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"27ad18ebd592c9ea8551cc0959933b0dd3090402e1c4eacace1304ec6a39a7d4"} Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465673 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"1ac2709f0fcedf3f81608ca1a0f69ad5080c7f047fe98d3ffad1aa7ecce36ad0"} Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465723 4720 scope.go:117] "RemoveContainer" containerID="58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465727 4720 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.491360 4720 scope.go:117] "RemoveContainer" containerID="970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.507201 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.519571 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538404 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: E0121 14:51:09.538877 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538900 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: E0121 14:51:09.538916 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538925 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.539137 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.539154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.540256 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.541626 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.542267 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.543719 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.556604 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655123 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655267 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655315 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655515 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655565 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757141 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757913 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758168 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758247 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.763214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.763855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.766068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.770992 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.783462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 
21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.860533 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.377555 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.476984 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.515313 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"4bb2d185eecf673998e9d879b9729a3d12306e355feaebcc9b978ba415abebc0"} Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.584048 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.584320 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" containerID="cri-o://ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" gracePeriod=10 Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.738615 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" path="/var/lib/kubelet/pods/a072861f-6e44-4b30-8666-7dc9b0e2078e/volumes" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.239757 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.173:5353: connect: connection refused" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.524035 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"af79cf129c292c74d6920bee5c752043bb2bfd7a327724ba70c9e06d8b214ebc"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.526330 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerID="ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" exitCode=0 Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.526375 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.527882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.669012 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.727562 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.742993 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792221 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792281 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792334 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792372 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.799966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz" (OuterVolumeSpecName: "kube-api-access-7bnkz") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "kube-api-access-7bnkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.845256 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.850322 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899501 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899532 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899542 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.918259 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config" (OuterVolumeSpecName: "config") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.943567 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.001322 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.001686 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.538339 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"aa28edecd3076d27d100618795053d2ff9074f6f62d9d1c55a24a5b14a96f5c5"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540626 4720 scope.go:117] "RemoveContainer" containerID="ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540815 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.545910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.566536 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.573564 4720 scope.go:117] "RemoveContainer" containerID="c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.581018 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.581000381 podStartE2EDuration="3.581000381s" podCreationTimestamp="2026-01-21 14:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:12.577353643 +0000 UTC m=+1310.486093575" watchObservedRunningTime="2026-01-21 14:51:12.581000381 +0000 UTC m=+1310.489740313" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.615610 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.628183 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.700989 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" path="/var/lib/kubelet/pods/0f51fb54-b6cb-4a03-b378-714f549cd2a1/volumes" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807551 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:12 crc kubenswrapper[4720]: E0121 14:51:12.807928 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="init" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807946 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="init" Jan 21 14:51:12 crc kubenswrapper[4720]: E0121 14:51:12.807956 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807962 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.809154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.809929 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.812942 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.813134 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.821053 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.927398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.927759 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.928026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.928243 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029523 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029686 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029718 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029737 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.036055 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.036319 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.037044 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.054415 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.125755 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.569024 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"c39eea449240cfba58293812deb344598d0485713df04e5bf59b49f0a0f0cbdf"} Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.571201 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:51:13 crc kubenswrapper[4720]: W0121 14:51:13.626470 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8fc07ed_67cb_4459_b7cb_ea8101ea4317.slice/crio-81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa WatchSource:0}: Error finding container 81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa: Status 404 returned error can't find the container with id 81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.632456 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.905445861 podStartE2EDuration="6.632437197s" podCreationTimestamp="2026-01-21 14:51:07 +0000 UTC" firstStartedPulling="2026-01-21 14:51:08.380379586 +0000 UTC m=+1306.289119508" lastFinishedPulling="2026-01-21 14:51:13.107370912 +0000 UTC m=+1311.016110844" observedRunningTime="2026-01-21 14:51:13.613692669 +0000 UTC m=+1311.522432621" watchObservedRunningTime="2026-01-21 14:51:13.632437197 +0000 UTC m=+1311.541177129" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.646576 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.344472 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.398800 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.579605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerStarted","Data":"08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac"} Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.579675 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerStarted","Data":"81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa"} Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.596360 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7qf47" podStartSLOduration=2.596346232 podStartE2EDuration="2.596346232s" podCreationTimestamp="2026-01-21 14:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:14.59593515 +0000 UTC m=+1312.504675082" watchObservedRunningTime="2026-01-21 14:51:14.596346232 +0000 UTC m=+1312.505086164" Jan 21 14:51:15 crc kubenswrapper[4720]: I0121 14:51:15.165546 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:15 crc kubenswrapper[4720]: I0121 14:51:15.585987 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" containerID="cri-o://0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" gracePeriod=2 Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.110111 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235129 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235412 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.236435 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities" (OuterVolumeSpecName: "utilities") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.242169 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5" (OuterVolumeSpecName: "kube-api-access-5mnj5") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "kube-api-access-5mnj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.337374 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.337424 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.361786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.438926 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596228 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" exitCode=0 Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596584 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"abc55b9d285c58116da7b148c4e87092b6925a74a8b7d4e6ff71e53eb61cdc76"} Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596602 4720 scope.go:117] "RemoveContainer" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596760 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.637620 4720 scope.go:117] "RemoveContainer" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.666161 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.674683 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.679396 4720 scope.go:117] "RemoveContainer" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.691945 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" path="/var/lib/kubelet/pods/5825e26f-385a-4384-a0e6-18a04e49ddf7/volumes" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705034 4720 scope.go:117] "RemoveContainer" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.705423 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": container with ID starting with 0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3 not found: ID does not exist" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705454 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} err="failed to get container status \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": rpc error: code = NotFound desc 
= could not find container \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": container with ID starting with 0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3 not found: ID does not exist" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705481 4720 scope.go:117] "RemoveContainer" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.705864 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": container with ID starting with 2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab not found: ID does not exist" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705885 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} err="failed to get container status \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": rpc error: code = NotFound desc = could not find container \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": container with ID starting with 2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab not found: ID does not exist" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705900 4720 scope.go:117] "RemoveContainer" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.706118 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": container with ID starting with 8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414 not found: ID does not exist" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.706165 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414"} err="failed to get container status \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": rpc error: code = NotFound desc = could not find container \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": container with ID starting with 8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414 not found: ID does not exist" Jan 21 14:51:18 crc kubenswrapper[4720]: I0121 14:51:18.615962 4720 generic.go:334] "Generic (PLEG): container finished" podID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerID="08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac" exitCode=0 Jan 21 14:51:18 crc kubenswrapper[4720]: I0121 14:51:18.616041 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerDied","Data":"08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac"} Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.861174 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.861489 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.979691 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123750 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123895 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123955 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.130611 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml" (OuterVolumeSpecName: "kube-api-access-rkgml") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "kube-api-access-rkgml". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.152082 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts" (OuterVolumeSpecName: "scripts") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.157940 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data" (OuterVolumeSpecName: "config-data") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.163880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226186 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226222 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226235 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226243 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.638797 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerDied","Data":"81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa"} Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.639062 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.639150 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.846813 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.847330 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" containerID="cri-o://0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.847416 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" containerID="cri-o://534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.858026 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.858253 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" containerID="cri-o://0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.876909 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.877128 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" 
containerName="nova-metadata-log" containerID="cri-o://9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.877532 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" containerID="cri-o://2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.880221 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.880221 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.649918 4720 generic.go:334] "Generic (PLEG): container finished" podID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" exitCode=143 Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.649988 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.651837 4720 generic.go:334] "Generic (PLEG): container finished" podID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" exitCode=143 Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.651897 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.234807 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.384808 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.385005 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.385043 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.398599 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47" (OuterVolumeSpecName: "kube-api-access-5gw47") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "kube-api-access-5gw47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.411824 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data" (OuterVolumeSpecName: "config-data") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.421645 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487827 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487877 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487896 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672629 4720 generic.go:334] "Generic (PLEG): container finished" podID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" exitCode=0 Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672688 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerDied","Data":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.673095 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerDied","Data":"6a4188e9bbe7707a1cbd5fc7c33ecb7166835f18ea14b69fcb7fc8e351f09029"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672707 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.673165 4720 scope.go:117] "RemoveContainer" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.697259 4720 scope.go:117] "RemoveContainer" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.697634 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": container with ID starting with 0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f not found: ID does not exist" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.697776 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} err="failed to get container status \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": rpc error: code = NotFound desc = could not find container \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": container with ID starting with 0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f not found: ID does not exist" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.719542 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.727992 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.733695 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734046 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734085 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-utilities" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-utilities" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734106 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734112 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734125 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734131 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc 
kubenswrapper[4720]: E0121 14:51:23.734143 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-content" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734150 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-content" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734293 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734309 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734323 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734841 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.738374 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.756766 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793079 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793120 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894308 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894363 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894387 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.897973 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.899118 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.913397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.017158 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:38262->10.217.0.178:8775: read: connection reset by peer" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.017593 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:38254->10.217.0.178:8775: read: connection reset by peer" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.050189 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.490401 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.493205 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605210 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") "
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605273 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") "
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605453 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") "
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605500 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") "
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") "
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.607930 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs" (OuterVolumeSpecName: "logs") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.613851 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682" (OuterVolumeSpecName: "kube-api-access-vs682") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "kube-api-access-vs682". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.641036 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.641866 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data" (OuterVolumeSpecName: "config-data") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.680214 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.687930 4720 generic.go:334] "Generic (PLEG): container finished" podID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" exitCode=0 Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.688013 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.701427 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" path="/var/lib/kubelet/pods/c65466e3-8bac-41f3-855f-202b0a6f9e82/volumes" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039c7115-f471-47ad-a7c4-75b1d7a40a94","Type":"ContainerStarted","Data":"8b5443bd6e0295f14b7abeec2709f0ba24bba33a6203357b06cd4d671535736d"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702614 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702630 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"9bd73af5fd59322a2bd5b4dadb3b5852cd6bfb2cf195e8e11949965c74ef70f1"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702649 4720 scope.go:117] "RemoveContainer" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709458 4720 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709493 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709504 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc 
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709526 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") on node \"crc\" DevicePath \"\""
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.734997 4720 scope.go:117] "RemoveContainer" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.744126 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.752688 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.768514 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.768967 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.768985 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata"
Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.769015 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769022 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769178 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769198 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.771635 4720 scope.go:117] "RemoveContainer" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.777230 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
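---- annotation (editor's note; not part of the captured log) ----
The cpu_manager/state_mem/memory_manager churn above is resource-manager bookkeeping: when a pod name is reused under a new UID, assignments recorded against the old UID are swept out before the replacement pod is admitted. A toy model of that sweep under assumed types (names and value format are illustrative only, not the kubelet's internals):

package main

import "fmt"

// Assignments are keyed by (podUID, containerName); in the kubelet the
// value would be a CPUSet or memory block, modeled here as a string.
type key struct{ podUID, container string }

type stateMem map[key]string

// removeStaleState drops every assignment whose pod UID is no longer active,
// mirroring the "RemoveStaleState: removing container" entries above.
func (s stateMem) removeStaleState(active map[string]bool) {
	for k := range s {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", k.container, k.podUID)
			delete(s, k)
		}
	}
}

func main() {
	s := stateMem{
		{"cc263e55-641f-47c7-ac02-f863d7cafa11", "nova-metadata-metadata"}: "0-3", // old UID
		{"7177980c-4db3-4902-aac2-c0825b778b2a", "nova-metadata-metadata"}: "0-3", // new UID
	}
	// Only the recreated nova-metadata-0 UID is still active.
	s.removeStaleState(map[string]bool{"7177980c-4db3-4902-aac2-c0825b778b2a": true})
	fmt.Println("remaining assignments:", len(s))
}
---- end annotation ----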
Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.778482 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": container with ID starting with 2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f not found: ID does not exist" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.778520 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} err="failed to get container status \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": rpc error: code = NotFound desc = could not find container \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": container with ID starting with 2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f not found: ID does not exist"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.778590 4720 scope.go:117] "RemoveContainer" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.779425 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.779792 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.781237 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": container with ID starting with 9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6 not found: ID does not exist" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.781274 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} err="failed to get container status \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": rpc error: code = NotFound desc = could not find container \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": container with ID starting with 9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6 not found: ID does not exist"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.802716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810264 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0"
Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810315 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0"
\"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810340 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810434 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810532 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913866 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913942 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913962 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913983 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.914023 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.915283 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.919821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.926144 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.929357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.934339 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.103892 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.588225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.698320 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"3398fbc0a33494e3166dc5a170a67b9b49df448837581f4a1cc4f92a0e2e21de"} Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.699620 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039c7115-f471-47ad-a7c4-75b1d7a40a94","Type":"ContainerStarted","Data":"c53420e4b575d7a9b57dd50454fd5a1bb2b67341f75d699a3bd146fa3d5c109b"} Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.717834 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.717791289 podStartE2EDuration="2.717791289s" podCreationTimestamp="2026-01-21 14:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:25.715698582 +0000 UTC m=+1323.624438534" watchObservedRunningTime="2026-01-21 14:51:25.717791289 +0000 UTC m=+1323.626531231" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.704120 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" path="/var/lib/kubelet/pods/cc263e55-641f-47c7-ac02-f863d7cafa11/volumes" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731013 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731243 4720 generic.go:334] "Generic (PLEG): container finished" podID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" exitCode=0
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731378 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"}
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"4bb2d185eecf673998e9d879b9729a3d12306e355feaebcc9b978ba415abebc0"}
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731477 4720 scope.go:117] "RemoveContainer" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.735351 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"0ccd76eb91ea3922e26d1bd6af83a1762bbeac0de1ab86d5fbb2cc92068f0849"}
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.735436 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"37a84f7f848bd89032dcbc59d157fe027e7dd905add3c4033c58af74f35ebc9e"}
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752563 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752753 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752806 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") "
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.755893 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs" (OuterVolumeSpecName: "logs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.763445 4720 scope.go:117] "RemoveContainer" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.807032 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.809438 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v" (OuterVolumeSpecName: "kube-api-access-f565v") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "kube-api-access-f565v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.816930 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data" (OuterVolumeSpecName: "config-data") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.827022 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82699777 podStartE2EDuration="2.82699777s" podCreationTimestamp="2026-01-21 14:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:26.807785859 +0000 UTC m=+1324.716525821" watchObservedRunningTime="2026-01-21 14:51:26.82699777 +0000 UTC m=+1324.735737702" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.840773 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.851634 4720 scope.go:117] "RemoveContainer" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" Jan 21 14:51:26 crc kubenswrapper[4720]: E0121 14:51:26.853824 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": container with ID starting with 534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22 not found: ID does not exist" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.853864 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"} err="failed to get container status \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": rpc error: code = NotFound desc = could not find container \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": container with ID starting with 534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22 not found: ID does not exist" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.853890 4720 scope.go:117] "RemoveContainer" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: E0121 14:51:26.854151 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": container with ID starting with 0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3 not found: ID does not exist" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854173 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} err="failed to get container status \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": rpc error: code = NotFound desc = could not find container \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": container with ID starting with 0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3 not found: ID does not exist" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854294 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854364 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854374 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854385 4720 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854394 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.863694 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.955634 4720 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.750588 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.790753 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.797753 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.815832 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: E0121 14:51:27.816264 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816285 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: E0121 14:51:27.816311 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816319 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816513 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816552 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.821365 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.823615 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.824787 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.824974 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.849278 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870827 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870878 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.871035 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971871 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0"
\"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971896 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971915 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971956 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.972000 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.972949 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.977316 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.977405 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.978171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.980138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.989470 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " 
pod="openstack/nova-api-0" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.147833 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.579108 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:28 crc kubenswrapper[4720]: W0121 14:51:28.585297 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c62270_7ab4_416b_bf5f_e0007f477733.slice/crio-3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9 WatchSource:0}: Error finding container 3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9: Status 404 returned error can't find the container with id 3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9 Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.687444 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" path="/var/lib/kubelet/pods/103562f8-b254-4684-80a8-5e6ff5160cfd/volumes" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.766186 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"570c387bcd4121303196e6f33a68b38bb1c295552bf10b14bf44b961167bcd92"} Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.766244 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9"} Jan 21 14:51:29 crc kubenswrapper[4720]: I0121 14:51:29.050375 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:51:29 crc kubenswrapper[4720]: I0121 14:51:29.776633 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"341932c24f87afeb662859ab55a926abbf6d0d46a11f9f65cfe94db229d33f65"} Jan 21 14:51:30 crc kubenswrapper[4720]: I0121 14:51:30.104514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:51:30 crc kubenswrapper[4720]: I0121 14:51:30.104573 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.050495 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.076647 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.094101 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=7.094078334 podStartE2EDuration="7.094078334s" podCreationTimestamp="2026-01-21 14:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:29.796581483 +0000 UTC m=+1327.705321435" watchObservedRunningTime="2026-01-21 14:51:34.094078334 +0000 UTC m=+1332.002818286" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.861437 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Jan 21 14:51:35 crc kubenswrapper[4720]: I0121 14:51:35.105129 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:51:35 crc kubenswrapper[4720]: I0121 14:51:35.105188 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:51:36 crc kubenswrapper[4720]: I0121 14:51:36.120863 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7177980c-4db3-4902-aac2-c0825b778b2a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:36 crc kubenswrapper[4720]: I0121 14:51:36.121192 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7177980c-4db3-4902-aac2-c0825b778b2a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:37 crc kubenswrapper[4720]: I0121 14:51:37.892133 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 14:51:38 crc kubenswrapper[4720]: I0121 14:51:38.148676 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:38 crc kubenswrapper[4720]: I0121 14:51:38.148849 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:39 crc kubenswrapper[4720]: I0121 14:51:39.156884 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33c62270-7ab4-416b-bf5f-e0007f477733" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:39 crc kubenswrapper[4720]: I0121 14:51:39.163908 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33c62270-7ab4-416b-bf5f-e0007f477733" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.110076 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.113044 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.116857 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.947425 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.155985 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.157078 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.158041 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: 
Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.957429 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.966055 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 21 14:51:58 crc kubenswrapper[4720]: I0121 14:51:58.046672 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 21 14:51:59 crc kubenswrapper[4720]: I0121 14:51:59.751045 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 21 14:52:02 crc kubenswrapper[4720]: I0121 14:52:02.672486 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" containerID="cri-o://41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" gracePeriod=604796
Jan 21 14:52:03 crc kubenswrapper[4720]: I0121 14:52:03.726146 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" containerID="cri-o://9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" gracePeriod=604797
Jan 21 14:52:07 crc kubenswrapper[4720]: I0121 14:52:07.444927 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Jan 21 14:52:07 crc kubenswrapper[4720]: I0121 14:52:07.869432 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.116946 4720 generic.go:334] "Generic (PLEG): container finished" podID="3a2eafda-c352-4311-94d5-a1aec1422699" containerID="41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" exitCode=0
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.117552 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c"}
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.220106 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
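---- annotation (editor's note; not part of the captured log) ----
The gracePeriod values logged when the rabbitmq containers are killed (604796 and 604797) are the remaining grace: the pod's terminationGracePeriodSeconds minus the whole seconds already elapsed since the corresponding SyncLoop DELETE above. Assuming the spec value is 604800s, i.e. the 7-day default the RabbitMQ cluster operator sets (an assumption; the spec is not in this log):

package main

import (
	"fmt"
	"time"
)

func main() {
	const spec = 604800 * time.Second // assumed terminationGracePeriodSeconds
	// Whole-second timestamps from the log: DELETE at 14:51:58, kill at 14:52:02.
	deleted, _ := time.Parse(time.Stamp, "Jan 21 14:51:58")
	killing, _ := time.Parse(time.Stamp, "Jan 21 14:52:02")
	remaining := spec - killing.Sub(deleted)
	fmt.Printf("%.0f\n", remaining.Seconds()) // 604796, as logged for rabbitmq-server-0
}
---- end annotation ----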
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352215 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352272 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352390 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352436 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352486 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352614 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352641 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352690 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") "
\"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352716 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.353978 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.357067 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.357436 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360059 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360376 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info" (OuterVolumeSpecName: "pod-info") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360818 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.374769 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.374928 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj" (OuterVolumeSpecName: "kube-api-access-lndbj") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "kube-api-access-lndbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.391576 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data" (OuterVolumeSpecName: "config-data") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455020 4720 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455296 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455418 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455545 4720 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455618 4720 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455695 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455752 4720 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455815 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455950 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.462184 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf" (OuterVolumeSpecName: "server-conf") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.490506 4720 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.499693 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558156 4720 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558187 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558200 4720 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.135937 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerID="9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" exitCode=0 Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.136196 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e"} Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149453 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"da6b6b430f12d2b56cf212530b8e484bf3b8d0da1c76e1f2c9cac8d57f6efdf2"} Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149487 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149509 4720 scope.go:117] "RemoveContainer" containerID="41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.182597 4720 scope.go:117] "RemoveContainer" containerID="c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.201832 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.209092 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.231841 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: E0121 14:52:10.232279 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="setup-container" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232301 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="setup-container" Jan 21 14:52:10 crc kubenswrapper[4720]: E0121 14:52:10.232317 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232326 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232548 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.233712 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.242405 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.243761 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.243990 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.244264 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qrxkj" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.244400 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.246021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.253151 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.269949 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.352084 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.396782 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397133 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397687 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397832 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397953 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.398100 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc 
kubenswrapper[4720]: I0121 14:52:10.398257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.398368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499404 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499452 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499542 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499574 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499784 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499824 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499847 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod 
\"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499930 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499950 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499973 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500171 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500230 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500252 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500268 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500351 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") 
pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500384 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500408 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500447 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500480 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500956 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.501503 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.502294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.508851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.509057 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.509338 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.510340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db" (OuterVolumeSpecName: "kube-api-access-4f5db") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "kube-api-access-4f5db". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.510964 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.513256 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.513584 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.514518 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.515141 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info" (OuterVolumeSpecName: "pod-info") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.516523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.517833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.526994 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.529247 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.535489 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.538709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.550240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.560144 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data" (OuterVolumeSpecName: "config-data") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.561912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.602570 4720 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.602809 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603174 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603548 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603614 4720 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603712 4720 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603793 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603855 4720 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603912 4720 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.610823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf" (OuterVolumeSpecName: "server-conf") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.623590 4720 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.641914 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.667479 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.689942 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" path="/var/lib/kubelet/pods/3a2eafda-c352-4311-94d5-a1aec1422699/volumes" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705046 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705074 4720 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705083 4720 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.090987 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"348934cdbf75477f1ab960f3f1053dff6dbf9d2daa8c4387234ea6851e521a6d"} Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168870 4720 scope.go:117] "RemoveContainer" containerID="9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168867 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.171127 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"ae02bad546a79ede0b3c6acafdd6a20aac4e570c51c42547e3c791db88948b01"} Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.236792 4720 scope.go:117] "RemoveContainer" containerID="c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.263969 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.287133 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.333638 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: E0121 14:52:11.333970 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.333984 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: E0121 14:52:11.334012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="setup-container" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.334019 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="setup-container" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.334167 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.335022 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.337960 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338120 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338279 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338420 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.340994 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.340993 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d7vj6" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.344441 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.360899 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417901 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418074 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418097 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418116 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418288 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418366 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418447 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418498 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520922 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520943 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520969 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520995 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521111 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521422 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.522737 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.523565 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.523969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.524063 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.526255 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.527582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.527784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.531980 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.542976 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.553756 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.653395 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.169734 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.186180 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"3b8bf9c12b304da22eda84d8d4aef9c2b44b6468916c9d1a4ccc32de0a4b3200"} Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.687847 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" path="/var/lib/kubelet/pods/c1752995-abec-46de-adf8-da9e3ed99d4a/volumes" Jan 21 14:52:13 crc kubenswrapper[4720]: I0121 14:52:13.200385 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba"} Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.211504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e"} Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.923222 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.924853 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.926775 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.945964 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001300 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001352 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001411 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001437 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001495 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001596 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102881 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102931 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: 
\"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102954 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103020 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103071 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103840 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.104366 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.105282 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.105819 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.106307 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: 
I0121 14:52:15.125284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.244373 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.692419 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226612 4720 generic.go:334] "Generic (PLEG): container finished" podID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerID="6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8" exitCode=0 Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226672 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8"} Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226959 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerStarted","Data":"116479b2273f15b257f5cd3bbc45cad56003eb6d06ec69e9ecf37fc87ad84fef"} Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.236250 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerStarted","Data":"f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650"} Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.237689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.263273 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" podStartSLOduration=3.263254895 podStartE2EDuration="3.263254895s" podCreationTimestamp="2026-01-21 14:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:17.25822392 +0000 UTC m=+1375.166963852" watchObservedRunningTime="2026-01-21 14:52:17.263254895 +0000 UTC m=+1375.171994827" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.245945 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.354728 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.355075 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" containerID="cri-o://1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" gracePeriod=10 Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.476392 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" 
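The readiness failure above is a TCP probe: the old dnsmasq container is being killed, so the connect to 10.217.0.183:5353 is refused. A rough stand-in for what the prober does, using only the standard library (the address and timeout are taken from or assumed for this log; the probe passes if the TCP connect succeeds within the timeout):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// tcpProbe mimics a TCP readiness probe: success is a completed connect.
func tcpProbe(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // e.g. "connect: connection refused" once the server is gone
	}
	return conn.Close()
}

func main() {
	// Endpoint copied from the probe output above; it is a cluster-internal
	// address, so the failure mode is the point, not reachability from here.
	if err := tcpProbe("10.217.0.183:5353", time.Second); err != nil {
		fmt.Println("probe failed:", err)
	} else {
		fmt.Println("probe ok")
	}
}
```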
podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.565052 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.572798 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.580093 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722519 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722631 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722731 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722764 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824817 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824905 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824990 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825200 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825314 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825400 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.826393 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.826699 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827574 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827773 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.847301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.908969 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.925065 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033135 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033445 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033530 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033552 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.042151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2" (OuterVolumeSpecName: "kube-api-access-wlfx2") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "kube-api-access-wlfx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.084648 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config" (OuterVolumeSpecName: "config") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.101952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.115467 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.129735 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136833 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136873 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136884 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136897 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136908 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325105 4720 generic.go:334] "Generic (PLEG): container finished" podID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" exitCode=0 Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325160 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325190 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" 
event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"62714c9a0f1a3b425c21ab81569bf9c4c0ba1448aea15537467fba81fe36bdf5"} Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325210 4720 scope.go:117] "RemoveContainer" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325381 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.363248 4720 scope.go:117] "RemoveContainer" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.376884 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.391353 4720 scope.go:117] "RemoveContainer" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.392826 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": container with ID starting with 1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203 not found: ID does not exist" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.392881 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} err="failed to get container status \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": rpc error: code = NotFound desc = could not find container \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": container with ID starting with 1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203 not found: ID does not exist" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.392915 4720 scope.go:117] "RemoveContainer" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.393832 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": container with ID starting with 8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61 not found: ID does not exist" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.393867 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} err="failed to get container status \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": rpc error: code = NotFound desc = could not find container \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": container with ID starting with 8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61 not found: ID does not exist" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.395577 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:26 crc kubenswrapper[4720]: W0121 
14:52:26.400417 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248ea464_73a3_4083_bb27_fc2cb7347224.slice/crio-98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76 WatchSource:0}: Error finding container 98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76: Status 404 returned error can't find the container with id 98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76 Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.403645 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.689227 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" path="/var/lib/kubelet/pods/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6/volumes" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.894130 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248ea464_73a3_4083_bb27_fc2cb7347224.slice/crio-a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337374 4720 generic.go:334] "Generic (PLEG): container finished" podID="248ea464-73a3-4083-bb27-fc2cb7347224" containerID="a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85" exitCode=0 Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337423 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerDied","Data":"a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85"} Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337450 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerStarted","Data":"98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76"} Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.348199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerStarted","Data":"bcf89378c96db8e2b2f7b382806c80c79e5afed950d542ae9620d8e1972262de"} Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.349560 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.372603 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" podStartSLOduration=3.3725707209999998 podStartE2EDuration="3.372570721s" podCreationTimestamp="2026-01-21 14:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:28.364515857 +0000 UTC m=+1386.273255809" watchObservedRunningTime="2026-01-21 14:52:28.372570721 +0000 UTC m=+1386.281310653" Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.910954 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.995963 4720 kubelet.go:2437] "SyncLoop 
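The E-level "ContainerStatus from runtime service failed ... NotFound" pairs above are a benign race: the container was removed between the RemoveContainer call and the follow-up status query, so the CRI runtime can only answer NotFound. CRI is gRPC, so a caller that wants to tolerate this checks the gRPC status code; a hedged sketch (the error text is copied from the log, the handling is illustrative, not the kubelet's code):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone treats a CRI NotFound as "the container is already removed".
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Stand-in error shaped like the one in the log; a real caller would
	// receive it from a CRI gRPC call such as ContainerStatus.
	err := status.Error(codes.NotFound,
		"could not find container \"1715d91e...\": ID does not exist")
	if alreadyGone(err) {
		fmt.Println("container already removed; nothing left to do")
	} else if err != nil {
		fmt.Println("real failure:", err)
	}
}
```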
DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.996226 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" containerID="cri-o://f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" gracePeriod=10 Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251374 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:36 crc kubenswrapper[4720]: E0121 14:52:36.251810 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="init" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251825 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="init" Jan 21 14:52:36 crc kubenswrapper[4720]: E0121 14:52:36.251841 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251849 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.252099 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.253561 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.279537 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.321869 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.321910 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.322037 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.416363 4720 generic.go:334] "Generic (PLEG): container finished" podID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerID="f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" exitCode=0 Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.416411 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650"} Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.423925 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424051 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424076 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424745 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.443698 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.593496 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:37 crc kubenswrapper[4720]: W0121 14:52:37.351862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35639e0c_f3bb_48c7_9879_442aff2fcdbc.slice/crio-e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e WatchSource:0}: Error finding container e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e: Status 404 returned error can't find the container with id e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.356044 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.430882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e"} Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.712171 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849672 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849789 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.850416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.850904 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.903861 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth" (OuterVolumeSpecName: "kube-api-access-ptkth") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "kube-api-access-ptkth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.953205 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.969368 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990537 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990975 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config" (OuterVolumeSpecName: "config") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.004823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054608 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054689 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054699 4720 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054709 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054716 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.460380 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e" exitCode=0 Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.460505 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e"} Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465792 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"116479b2273f15b257f5cd3bbc45cad56003eb6d06ec69e9ecf37fc87ad84fef"} Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465843 4720 scope.go:117] "RemoveContainer" containerID="f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465857 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.501629 4720 scope.go:117] "RemoveContainer" containerID="6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.508879 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.516514 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.691936 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" path="/var/lib/kubelet/pods/532b4122-14d5-4dd2-84e5-f08c72a5c34e/volumes" Jan 21 14:52:40 crc kubenswrapper[4720]: I0121 14:52:40.486064 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee"} Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.499157 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee" exitCode=0 Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.499233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee"} Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.888699 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:41 crc kubenswrapper[4720]: E0121 14:52:41.889116 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="init" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889134 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="init" Jan 21 14:52:41 crc kubenswrapper[4720]: E0121 14:52:41.889155 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889164 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889345 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889891 4720 util.go:30] "No sandbox for pod can be found. 
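"Cleaned up orphaned pod volumes dir" means the kubelet removed /var/lib/kubelet/pods/&lt;uid&gt;/volumes once the pod's volumes were fully torn down. Purely as an illustration of where those per-pod directories live (the kubelet does the real cleanup itself; this sketch is read-only and must run on the node, typically as root):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	root := "/var/lib/kubelet/pods"
	pods, err := os.ReadDir(root)
	if err != nil {
		fmt.Println(err)
		return
	}
	for _, p := range pods {
		// Each pod UID gets a volumes/ dir holding one subdir per plugin,
		// e.g. kubernetes.io~configmap, kubernetes.io~projected.
		vols := filepath.Join(root, p.Name(), "volumes")
		if entries, err := os.ReadDir(vols); err == nil {
			fmt.Printf("%s: %d volume plugin dirs\n", p.Name(), len(entries))
		}
	}
}
```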
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.894941 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895393 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895685 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895975 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.919277 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.928621 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.928924 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.929073 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.929280 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.031580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.031975 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: 
\"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.032001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.032109 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037326 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037463 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.058505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.213267 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.510217 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007"} Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.536743 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5cdm8" podStartSLOduration=2.95468241 podStartE2EDuration="6.536721704s" podCreationTimestamp="2026-01-21 14:52:36 +0000 UTC" firstStartedPulling="2026-01-21 14:52:38.462279323 +0000 UTC m=+1396.371019255" lastFinishedPulling="2026-01-21 14:52:42.044318617 +0000 UTC m=+1399.953058549" observedRunningTime="2026-01-21 14:52:42.527043913 +0000 UTC m=+1400.435783865" watchObservedRunningTime="2026-01-21 14:52:42.536721704 +0000 UTC m=+1400.445461636" Jan 21 14:52:42 crc kubenswrapper[4720]: W0121 14:52:42.742061 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0506243d_6216_4541_8f14_8b2c2beb409b.slice/crio-de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc WatchSource:0}: Error finding container de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc: Status 404 returned error can't find the container with id de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.753587 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:43 crc kubenswrapper[4720]: I0121 14:52:43.519568 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerStarted","Data":"de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc"} Jan 21 14:52:45 crc kubenswrapper[4720]: I0121 14:52:45.542826 4720 generic.go:334] "Generic (PLEG): container finished" podID="f73dd82b-9ad1-4deb-b244-6d42a3f25f89" containerID="79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba" exitCode=0 Jan 21 14:52:45 crc kubenswrapper[4720]: I0121 14:52:45.542966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerDied","Data":"79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.553515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"ab540f557245b3bfdd0325f392688f0c5633f332b617739e086b14de929c49c9"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554036 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554830 4720 generic.go:334] "Generic (PLEG): container finished" podID="4906b5ed-c663-4e81-ab33-2b8f33777cd1" containerID="18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e" exitCode=0 Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554874 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
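The startup-latency entries carry enough data to check themselves. For community-operators-5cdm8 the numbers are consistent with: E2E duration = watchObservedRunningTime minus podCreationTimestamp, and SLO duration = E2E minus the image-pull window (lastFinishedPulling minus firstStartedPulling). That relationship is inferred from the logged values, not from kubelet source; a quick verification with the timestamps copied from the entry above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-01-21 14:52:36 +0000 UTC")
	observed := parse("2026-01-21 14:52:42.536721704 +0000 UTC")
	pullStart := parse("2026-01-21 14:52:38.462279323 +0000 UTC")
	pullEnd := parse("2026-01-21 14:52:42.044318617 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - pullEnd.Sub(pullStart)
	// Prints 6.536721704s and 2.95468241s, matching podStartE2EDuration
	// and podStartSLOduration in the log entry.
	fmt.Println(e2e, slo)
}
```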
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerDied","Data":"18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.590369 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.590347733 podStartE2EDuration="36.590347733s" podCreationTimestamp="2026-01-21 14:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:46.576725508 +0000 UTC m=+1404.485465450" watchObservedRunningTime="2026-01-21 14:52:46.590347733 +0000 UTC m=+1404.499087675" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.594325 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.594387 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.645977 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:47 crc kubenswrapper[4720]: I0121 14:52:47.617094 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:47 crc kubenswrapper[4720]: I0121 14:52:47.670153 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:49 crc kubenswrapper[4720]: I0121 14:52:49.580746 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5cdm8" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" containerID="cri-o://b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" gracePeriod=2 Jan 21 14:52:50 crc kubenswrapper[4720]: I0121 14:52:50.595484 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" exitCode=0 Jan 21 14:52:50 crc kubenswrapper[4720]: I0121 14:52:50.595540 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007"} Jan 21 14:52:52 crc kubenswrapper[4720]: I0121 14:52:52.881786 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:52:52 crc kubenswrapper[4720]: I0121 14:52:52.882215 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.088095 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185669 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185751 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.186809 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities" (OuterVolumeSpecName: "utilities") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.189455 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6" (OuterVolumeSpecName: "kube-api-access-g8hh6") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "kube-api-access-g8hh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.235921 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287759 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287797 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287809 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.632949 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerStarted","Data":"5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.634876 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"6306cc8c931fa6efdfe4271cdd0ca55e0e1479ce75a7982c399644765f122a3b"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.635071 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636738 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636768 4720 scope.go:117] "RemoveContainer" containerID="b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636809 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.656012 4720 scope.go:117] "RemoveContainer" containerID="eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.658298 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" podStartSLOduration=2.500018223 podStartE2EDuration="12.658286675s" podCreationTimestamp="2026-01-21 14:52:41 +0000 UTC" firstStartedPulling="2026-01-21 14:52:42.745551778 +0000 UTC m=+1400.654291700" lastFinishedPulling="2026-01-21 14:52:52.90382022 +0000 UTC m=+1410.812560152" observedRunningTime="2026-01-21 14:52:53.649928302 +0000 UTC m=+1411.558668254" watchObservedRunningTime="2026-01-21 14:52:53.658286675 +0000 UTC m=+1411.567026607" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.682925 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.682910028 podStartE2EDuration="42.682910028s" podCreationTimestamp="2026-01-21 14:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:53.676641597 +0000 UTC m=+1411.585381529" watchObservedRunningTime="2026-01-21 14:52:53.682910028 +0000 UTC m=+1411.591649960" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.703837 4720 scope.go:117] "RemoveContainer" containerID="514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.711625 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.727484 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:54 crc kubenswrapper[4720]: I0121 14:52:54.688784 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" path="/var/lib/kubelet/pods/35639e0c-f3bb-48c7-9879-442aff2fcdbc/volumes" Jan 21 14:53:00 crc kubenswrapper[4720]: I0121 14:53:00.646837 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 14:53:05 crc kubenswrapper[4720]: I0121 14:53:05.750319 4720 generic.go:334] "Generic (PLEG): container finished" podID="0506243d-6216-4541-8f14-8b2c2beb409b" containerID="5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4" exitCode=0 Jan 21 14:53:05 crc kubenswrapper[4720]: I0121 14:53:05.750446 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerDied","Data":"5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4"} Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.234904 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360773 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.361101 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.366015 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4" (OuterVolumeSpecName: "kube-api-access-z4ff4") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "kube-api-access-z4ff4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.366385 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.387960 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory" (OuterVolumeSpecName: "inventory") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.403483 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.462989 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463025 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463040 4720 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463052 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerDied","Data":"de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc"} Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771310 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771412 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.891255 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0506243d_6216_4541_8f14_8b2c2beb409b.slice\": RecentStats: unable to find data in memory cache]" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.897401 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898092 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898115 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898145 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-utilities" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898154 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-utilities" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898185 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898193 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898231 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-content" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898240 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-content" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898543 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898574 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.899596 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.909336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.909867 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.910114 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.910439 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.934606 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075716 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075807 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177031 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177278 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177366 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.192079 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.192428 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.193309 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.194811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.259809 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.832927 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:08 crc kubenswrapper[4720]: W0121 14:53:08.837862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb96fb314_d163_41a0_b2b0_9a9c117d504c.slice/crio-6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9 WatchSource:0}: Error finding container 6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9: Status 404 returned error can't find the container with id 6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9 Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.790487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerStarted","Data":"66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5"} Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.790998 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerStarted","Data":"6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9"} Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.815668 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" podStartSLOduration=2.388559455 podStartE2EDuration="2.815634737s" podCreationTimestamp="2026-01-21 14:53:07 +0000 UTC" firstStartedPulling="2026-01-21 14:53:08.840965977 +0000 UTC m=+1426.749705919" lastFinishedPulling="2026-01-21 14:53:09.268041269 +0000 UTC m=+1427.176781201" observedRunningTime="2026-01-21 14:53:09.815528424 +0000 UTC m=+1427.724268376" watchObservedRunningTime="2026-01-21 14:53:09.815634737 +0000 UTC m=+1427.724374669" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.238821 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.240799 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.253487 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334177 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334478 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334526 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436833 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.437260 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.437316 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.460143 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.567915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.666075 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.085421 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:12 crc kubenswrapper[4720]: W0121 14:53:12.086039 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3eca900_8aa2_4835_9864_c67e98b7172e.slice/crio-6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334 WatchSource:0}: Error finding container 6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334: Status 404 returned error can't find the container with id 6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334 Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.828850 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" exitCode=0 Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.829181 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1"} Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.829215 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334"} Jan 21 14:53:13 crc kubenswrapper[4720]: I0121 14:53:13.843902 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} Jan 21 14:53:15 crc kubenswrapper[4720]: I0121 14:53:15.886031 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" exitCode=0 Jan 21 14:53:15 crc kubenswrapper[4720]: I0121 14:53:15.886830 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} Jan 21 14:53:16 crc kubenswrapper[4720]: I0121 14:53:16.898432 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} Jan 21 14:53:16 crc kubenswrapper[4720]: 
I0121 14:53:16.927353 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vs9g" podStartSLOduration=2.374680274 podStartE2EDuration="5.927334207s" podCreationTimestamp="2026-01-21 14:53:11 +0000 UTC" firstStartedPulling="2026-01-21 14:53:12.830881527 +0000 UTC m=+1430.739621459" lastFinishedPulling="2026-01-21 14:53:16.38353542 +0000 UTC m=+1434.292275392" observedRunningTime="2026-01-21 14:53:16.918573903 +0000 UTC m=+1434.827313845" watchObservedRunningTime="2026-01-21 14:53:16.927334207 +0000 UTC m=+1434.836074149" Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.568503 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.569363 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.619260 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.978733 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.029494 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.880023 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.880115 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:53:23 crc kubenswrapper[4720]: I0121 14:53:23.952996 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vs9g" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" containerID="cri-o://7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" gracePeriod=2 Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.431564 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467258 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467459 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.468512 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities" (OuterVolumeSpecName: "utilities") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.474376 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d" (OuterVolumeSpecName: "kube-api-access-7qn2d") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "kube-api-access-7qn2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.517484 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569822 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569866 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569880 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964497 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" exitCode=0 Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334"} Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964569 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964577 4720 scope.go:117] "RemoveContainer" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.000440 4720 scope.go:117] "RemoveContainer" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.003755 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.013232 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.031137 4720 scope.go:117] "RemoveContainer" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070289 4720 scope.go:117] "RemoveContainer" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.070673 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": container with ID starting with 7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e not found: ID does not exist" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070699 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} err="failed to get container status \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": rpc error: code = NotFound desc = could not find container \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": container with ID starting with 7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e not found: ID does not exist" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070719 4720 scope.go:117] "RemoveContainer" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.071222 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": container with ID starting with 8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b not found: ID does not exist" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071238 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} err="failed to get container status \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": rpc error: code = NotFound desc = could not find container \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": container with ID starting with 8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b not found: ID does not exist" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071253 4720 scope.go:117] "RemoveContainer" 
containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.071471 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": container with ID starting with fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1 not found: ID does not exist" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071487 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1"} err="failed to get container status \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": rpc error: code = NotFound desc = could not find container \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": container with ID starting with fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1 not found: ID does not exist" Jan 21 14:53:26 crc kubenswrapper[4720]: I0121 14:53:26.690382 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" path="/var/lib/kubelet/pods/d3eca900-8aa2-4835-9864-c67e98b7172e/volumes" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.879616 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.880429 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.880493 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.881694 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.881807 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f" gracePeriod=600 Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214467 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f" exitCode=0 Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214551 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"} Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214786 4720 scope.go:117] "RemoveContainer" containerID="c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" Jan 21 14:53:57 crc kubenswrapper[4720]: I0121 14:53:57.967798 4720 scope.go:117] "RemoveContainer" containerID="cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9" Jan 21 14:53:57 crc kubenswrapper[4720]: I0121 14:53:57.995872 4720 scope.go:117] "RemoveContainer" containerID="bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f" Jan 21 14:53:58 crc kubenswrapper[4720]: I0121 14:53:58.061380 4720 scope.go:117] "RemoveContainer" containerID="7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687" Jan 21 14:54:58 crc kubenswrapper[4720]: I0121 14:54:58.191572 4720 scope.go:117] "RemoveContainer" containerID="59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.254138 4720 scope.go:117] "RemoveContainer" containerID="fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.280059 4720 scope.go:117] "RemoveContainer" containerID="4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.299724 4720 scope.go:117] "RemoveContainer" containerID="09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.317629 4720 scope.go:117] "RemoveContainer" containerID="162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.335518 4720 scope.go:117] "RemoveContainer" containerID="2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.352400 4720 scope.go:117] "RemoveContainer" containerID="80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.370252 4720 scope.go:117] "RemoveContainer" containerID="7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" Jan 21 14:56:22 crc kubenswrapper[4720]: I0121 14:56:22.884234 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:56:22 crc kubenswrapper[4720]: I0121 14:56:22.885155 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:56:39 crc kubenswrapper[4720]: I0121 14:56:39.946351 4720 generic.go:334] "Generic (PLEG): 
container finished" podID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerID="66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5" exitCode=0 Jan 21 14:56:39 crc kubenswrapper[4720]: I0121 14:56:39.946454 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerDied","Data":"66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5"} Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.337878 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.527904 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528103 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528133 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528150 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.659546 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.659647 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn" (OuterVolumeSpecName: "kube-api-access-qtctn") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "kube-api-access-qtctn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.664186 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.664441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory" (OuterVolumeSpecName: "inventory") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731773 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731809 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731819 4720 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731830 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.970914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerDied","Data":"6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9"} Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.970964 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.971077 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.088304 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089903 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089923 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089951 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-content" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089958 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-content" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089968 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-utilities" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089974 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-utilities" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089994 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090187 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090203 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090831 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093247 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093487 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093636 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093798 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.103235 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.239956 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.240096 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.240157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342037 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.346803 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.347434 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.369590 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.417564 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.030523 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.035895 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.991486 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerStarted","Data":"e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98"} Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.991849 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerStarted","Data":"a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0"} Jan 21 14:56:44 crc kubenswrapper[4720]: I0121 14:56:44.010753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" podStartSLOduration=1.493850815 podStartE2EDuration="2.010732775s" podCreationTimestamp="2026-01-21 14:56:42 +0000 UTC" firstStartedPulling="2026-01-21 14:56:43.03493696 +0000 UTC m=+1640.943676892" lastFinishedPulling="2026-01-21 14:56:43.55181892 +0000 UTC m=+1641.460558852" observedRunningTime="2026-01-21 14:56:44.010357616 +0000 UTC m=+1641.919097548" watchObservedRunningTime="2026-01-21 14:56:44.010732775 +0000 UTC 
m=+1641.919472727" Jan 21 14:56:52 crc kubenswrapper[4720]: I0121 14:56:52.879527 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:56:52 crc kubenswrapper[4720]: I0121 14:56:52.880037 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:56:58 crc kubenswrapper[4720]: I0121 14:56:58.424100 4720 scope.go:117] "RemoveContainer" containerID="0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" Jan 21 14:56:59 crc kubenswrapper[4720]: I0121 14:56:59.048255 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:56:59 crc kubenswrapper[4720]: I0121 14:56:59.056110 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.032198 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.045963 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.691788 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" path="/var/lib/kubelet/pods/0d0385ad-a123-4c46-a96f-652dee1f89cd/volumes" Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.693470 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" path="/var/lib/kubelet/pods/b4bb55ed-9214-4f25-8740-ac50421baa4b/volumes" Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.039136 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.048729 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.697293 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" path="/var/lib/kubelet/pods/290dffa3-ed33-4571-aeb1-092aae1d8105/volumes" Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.040723 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.049201 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.059805 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.072825 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.025096 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:57:06 
crc kubenswrapper[4720]: I0121 14:57:06.034165 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.688612 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" path="/var/lib/kubelet/pods/49f0b95b-6621-43fe-93c2-d4e7704f1f61/volumes" Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.689768 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" path="/var/lib/kubelet/pods/8161ded5-d8ab-48b7-9c1a-16a7155641d1/volumes" Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.690512 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" path="/var/lib/kubelet/pods/a4fbe0fa-0158-480f-9f6d-2d589da3b91e/volumes" Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.042466 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.051487 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.688818 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" path="/var/lib/kubelet/pods/1fc2d647-37b6-4437-98fc-1d95af05cfe0/volumes" Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880147 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880207 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880261 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.881198 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.881578 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" gracePeriod=600 Jan 21 14:57:23 crc kubenswrapper[4720]: E0121 14:57:23.001166 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316530 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" exitCode=0 Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316579 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"} Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316617 4720 scope.go:117] "RemoveContainer" containerID="4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f" Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.317360 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:57:23 crc kubenswrapper[4720]: E0121 14:57:23.317713 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:57:36 crc kubenswrapper[4720]: I0121 14:57:36.679022 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:57:36 crc kubenswrapper[4720]: E0121 14:57:36.680359 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:57:39 crc kubenswrapper[4720]: I0121 14:57:39.035439 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dtj5w"] Jan 21 14:57:39 crc kubenswrapper[4720]: I0121 14:57:39.052706 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dtj5w"] Jan 21 14:57:40 crc kubenswrapper[4720]: I0121 14:57:40.689344 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" path="/var/lib/kubelet/pods/c40c650e-a05e-4cc0-88fa-d56eae92d29a/volumes" Jan 21 14:57:43 crc kubenswrapper[4720]: I0121 14:57:43.044672 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:57:43 crc kubenswrapper[4720]: I0121 14:57:43.061583 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:57:44 crc kubenswrapper[4720]: I0121 14:57:44.692620 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" path="/var/lib/kubelet/pods/077e6634-d42f-4765-ab65-9e24cf21a047/volumes" Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.039054 4720 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.059333 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.073080 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.084765 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.035176 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.045204 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.055457 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.062982 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.068918 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.074717 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.678362 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:57:48 crc kubenswrapper[4720]: E0121 14:57:48.678623 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.693573 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" path="/var/lib/kubelet/pods/5ffa29ff-07bd-40cc-9853-a484f79b382f/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.694336 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" path="/var/lib/kubelet/pods/6545ddce-5b65-4702-9dee-2f2d9644123e/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.695123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" path="/var/lib/kubelet/pods/82f9f1ca-7fe3-4e17-8393-20364149010d/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.695821 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" path="/var/lib/kubelet/pods/8da5c3a6-e588-412a-b884-7875fe439e61/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.697304 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" path="/var/lib/kubelet/pods/d3a4204b-d91a-4d30-bea2-c327b452b61a/volumes" Jan 21 14:57:52 crc 
kubenswrapper[4720]: I0121 14:57:52.034168 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:57:52 crc kubenswrapper[4720]: I0121 14:57:52.051148 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:57:52 crc kubenswrapper[4720]: I0121 14:57:52.693985 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" path="/var/lib/kubelet/pods/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17/volumes" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.477014 4720 scope.go:117] "RemoveContainer" containerID="41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.529042 4720 scope.go:117] "RemoveContainer" containerID="7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.555340 4720 scope.go:117] "RemoveContainer" containerID="307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.594386 4720 scope.go:117] "RemoveContainer" containerID="8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.642352 4720 scope.go:117] "RemoveContainer" containerID="abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.673843 4720 scope.go:117] "RemoveContainer" containerID="3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.709349 4720 scope.go:117] "RemoveContainer" containerID="77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.781933 4720 scope.go:117] "RemoveContainer" containerID="b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.803787 4720 scope.go:117] "RemoveContainer" containerID="13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.820086 4720 scope.go:117] "RemoveContainer" containerID="93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.836586 4720 scope.go:117] "RemoveContainer" containerID="5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.853302 4720 scope.go:117] "RemoveContainer" containerID="bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.874326 4720 scope.go:117] "RemoveContainer" containerID="dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.900389 4720 scope.go:117] "RemoveContainer" containerID="f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d" Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.916878 4720 scope.go:117] "RemoveContainer" containerID="0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b" Jan 21 14:57:59 crc kubenswrapper[4720]: I0121 14:57:59.662299 4720 generic.go:334] "Generic (PLEG): container finished" podID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerID="e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98" exitCode=0 Jan 21 14:57:59 crc kubenswrapper[4720]: I0121 14:57:59.662459 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerDied","Data":"e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98"} Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.075898 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.147736 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.147994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.148058 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.152727 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v" (OuterVolumeSpecName: "kube-api-access-9tw9v") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "kube-api-access-9tw9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.175685 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory" (OuterVolumeSpecName: "inventory") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.180269 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
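
The machine-config-daemon entries recurring through this stretch show a failed liveness probe (connection refused on 127.0.0.1:8798) leading to a kill with a 600s grace period, after which every restart attempt is refused with "back-off 5m0s restarting failed container": the pod has reached the cap of the kubelet's CrashLoopBackOff delay, which is why the same error repeats every sync for minutes. A sketch of that schedule; the 10s initial delay and 5m cap match the kubelet's documented CrashLoopBackOff behaviour, but treat the exact constants as assumptions rather than quotes from the source:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for restart := 1; delay < maxDelay; restart++ {
            fmt.Printf("restart %d: wait %v\n", restart, delay)
            delay *= 2 // the back-off doubles after each failed restart
        }
        fmt.Printf("later restarts: wait %v (the \"back-off 5m0s\" in the entries above)\n", maxDelay)
    }
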
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249874 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249909 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249919 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682040 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerDied","Data":"a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0"} Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682086 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682145 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795166 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:01 crc kubenswrapper[4720]: E0121 14:58:01.795533 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795552 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795735 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.796307 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799532 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799538 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799817 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799903 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.808957 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871289 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.973623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.974075 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.974130 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.980164 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.983133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.993417 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.148929 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.653641 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.689322 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:02 crc kubenswrapper[4720]: E0121 14:58:02.689675 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.704964 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerStarted","Data":"85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8"} Jan 21 14:58:03 crc kubenswrapper[4720]: I0121 14:58:03.716621 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerStarted","Data":"73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88"} Jan 21 14:58:03 crc kubenswrapper[4720]: I0121 14:58:03.768438 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" podStartSLOduration=2.250700711 podStartE2EDuration="2.768418323s" podCreationTimestamp="2026-01-21 14:58:01 +0000 UTC" firstStartedPulling="2026-01-21 14:58:02.662760091 +0000 UTC m=+1720.571500023" lastFinishedPulling="2026-01-21 14:58:03.180477683 +0000 UTC m=+1721.089217635" observedRunningTime="2026-01-21 14:58:03.734839569 +0000 UTC m=+1721.643579541" watchObservedRunningTime="2026-01-21 14:58:03.768418323 +0000 UTC m=+1721.677158255" Jan 21 14:58:08 crc kubenswrapper[4720]: I0121 14:58:08.769946 4720 generic.go:334] "Generic (PLEG): container finished" podID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerID="73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88" exitCode=0 Jan 21 14:58:08 crc kubenswrapper[4720]: I0121 14:58:08.770038 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerDied","Data":"73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88"} Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.224139 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233193 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233326 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.241703 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5" (OuterVolumeSpecName: "kube-api-access-2hgk5") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "kube-api-access-2hgk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.270168 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.270943 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory" (OuterVolumeSpecName: "inventory") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335539 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335567 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335576 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790217 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerDied","Data":"85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8"} Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790268 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790345 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.892437 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:10 crc kubenswrapper[4720]: E0121 14:58:10.892835 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.892858 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.893077 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.893814 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.903067 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934119 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934340 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934544 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.938309 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961750 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961806 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961846 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.063846 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.064437 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.064631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.069502 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.070555 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.079966 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.262183 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.787188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.827182 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerStarted","Data":"84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b"} Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.827505 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerStarted","Data":"aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb"} Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.854708 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" podStartSLOduration=2.424295347 podStartE2EDuration="2.854687183s" podCreationTimestamp="2026-01-21 14:58:10 +0000 UTC" firstStartedPulling="2026-01-21 14:58:11.811067972 +0000 UTC m=+1729.719807904" lastFinishedPulling="2026-01-21 14:58:12.241459758 +0000 UTC m=+1730.150199740" observedRunningTime="2026-01-21 14:58:12.845387518 +0000 UTC m=+1730.754127470" watchObservedRunningTime="2026-01-21 14:58:12.854687183 +0000 UTC m=+1730.763427135" Jan 21 14:58:17 crc kubenswrapper[4720]: I0121 14:58:17.678923 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:17 crc kubenswrapper[4720]: E0121 14:58:17.679897 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.049831 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.063398 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.689587 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" path="/var/lib/kubelet/pods/7a6c6de6-8f88-4c87-bd8e-46579996948e/volumes" Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.048093 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.060259 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.695027 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" path="/var/lib/kubelet/pods/72a4a042-08eb-4644-81c0-2cfcd105cf2b/volumes" Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.043202 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.060730 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.678623 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:31 crc kubenswrapper[4720]: E0121 14:58:31.678922 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:32 crc kubenswrapper[4720]: I0121 14:58:32.688315 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e400cd-53d2-4738-96f0-75829e339879" path="/var/lib/kubelet/pods/03e400cd-53d2-4738-96f0-75829e339879/volumes" Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.030711 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.039779 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.687541 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" path="/var/lib/kubelet/pods/2eaf7930-34cf-4396-9b94-c09d3a5da09a/volumes" Jan 21 14:58:39 crc kubenswrapper[4720]: I0121 14:58:39.030998 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:58:39 crc kubenswrapper[4720]: I0121 14:58:39.041916 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:58:40 crc kubenswrapper[4720]: I0121 14:58:40.698041 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d468a637-b18d-47fd-9b04-910dba72a955" path="/var/lib/kubelet/pods/d468a637-b18d-47fd-9b04-910dba72a955/volumes" Jan 21 14:58:44 crc kubenswrapper[4720]: I0121 14:58:44.678621 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:44 crc kubenswrapper[4720]: E0121 14:58:44.679384 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:57 crc kubenswrapper[4720]: I0121 14:58:57.678422 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:57 crc kubenswrapper[4720]: E0121 14:58:57.680206 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:58 crc kubenswrapper[4720]: I0121 14:58:58.211770 4720 generic.go:334] "Generic (PLEG): container finished" podID="5c493941-48f3-4a3e-a66a-4f045487005e" containerID="84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b" exitCode=0 Jan 21 14:58:58 crc kubenswrapper[4720]: I0121 14:58:58.212263 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerDied","Data":"84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b"} Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.132340 4720 scope.go:117] "RemoveContainer" containerID="e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.199943 4720 scope.go:117] "RemoveContainer" containerID="aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.265168 4720 scope.go:117] "RemoveContainer" containerID="da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.306921 4720 scope.go:117] "RemoveContainer" containerID="4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.358121 4720 scope.go:117] "RemoveContainer" containerID="4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.553539 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.632858 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.632951 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.633004 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.639321 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn" (OuterVolumeSpecName: "kube-api-access-56fqn") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "kube-api-access-56fqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.657703 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory" (OuterVolumeSpecName: "inventory") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.663520 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735289 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735329 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735343 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258541 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerDied","Data":"aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb"} Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258888 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258587 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.317640 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:00 crc kubenswrapper[4720]: E0121 14:59:00.318012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318029 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318222 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318785 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.320955 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325128 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325282 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325360 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.333240 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.455600 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.455967 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.456104 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557415 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557568 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.563871 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.563945 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.574207 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.638636 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:01 crc kubenswrapper[4720]: I0121 14:59:01.167075 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:01 crc kubenswrapper[4720]: I0121 14:59:01.267728 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerStarted","Data":"0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576"} Jan 21 14:59:02 crc kubenswrapper[4720]: I0121 14:59:02.276954 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerStarted","Data":"0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe"} Jan 21 14:59:02 crc kubenswrapper[4720]: I0121 14:59:02.301007 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" podStartSLOduration=1.655538593 podStartE2EDuration="2.300986759s" podCreationTimestamp="2026-01-21 14:59:00 +0000 UTC" firstStartedPulling="2026-01-21 14:59:01.178130043 +0000 UTC m=+1779.086869995" lastFinishedPulling="2026-01-21 14:59:01.823578229 +0000 UTC m=+1779.732318161" observedRunningTime="2026-01-21 14:59:02.297879409 +0000 UTC m=+1780.206619361" watchObservedRunningTime="2026-01-21 14:59:02.300986759 +0000 UTC m=+1780.209726691" Jan 21 14:59:06 crc kubenswrapper[4720]: I0121 14:59:06.310713 4720 generic.go:334] "Generic (PLEG): container finished" podID="09ee2ae5-f10a-4080-90df-29c01525e871" 
containerID="0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe" exitCode=0 Jan 21 14:59:06 crc kubenswrapper[4720]: I0121 14:59:06.311771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerDied","Data":"0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe"} Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.312548 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335013 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerDied","Data":"0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576"} Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335052 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335106 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415527 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415644 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.427224 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv" (OuterVolumeSpecName: "kube-api-access-n4kwv") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "kube-api-access-n4kwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435194 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:08 crc kubenswrapper[4720]: E0121 14:59:08.435552 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435567 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435810 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.436566 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.484589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory" (OuterVolumeSpecName: "inventory") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.488990 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.490719 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520718 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520752 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520761 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622362 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723670 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723709 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.729178 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.729422 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.740985 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.866422 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:10 crc kubenswrapper[4720]: I0121 14:59:10.001151 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:10 crc kubenswrapper[4720]: I0121 14:59:10.352128 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerStarted","Data":"e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65"} Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.361155 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerStarted","Data":"15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453"} Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.387576 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" podStartSLOduration=3.005119009 podStartE2EDuration="3.38755669s" podCreationTimestamp="2026-01-21 14:59:08 +0000 UTC" firstStartedPulling="2026-01-21 14:59:10.014227718 +0000 UTC m=+1787.922967650" lastFinishedPulling="2026-01-21 14:59:10.396665399 +0000 UTC m=+1788.305405331" observedRunningTime="2026-01-21 14:59:11.381217568 +0000 UTC m=+1789.289957530" watchObservedRunningTime="2026-01-21 14:59:11.38755669 +0000 UTC m=+1789.296296622" Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.678318 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:11 crc kubenswrapper[4720]: E0121 
14:59:11.678589 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:23 crc kubenswrapper[4720]: I0121 14:59:23.678530 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:23 crc kubenswrapper[4720]: E0121 14:59:23.679169 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.054696 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.073270 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.082722 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.095492 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.105248 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.114506 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.123866 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.131863 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.139333 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.145230 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.150755 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.156324 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.692912 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" path="/var/lib/kubelet/pods/01f8146d-b3dd-48a4-b1a8-9fa590c0d808/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.693832 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a08abcad-85f1-431b-853e-3599eebed756" path="/var/lib/kubelet/pods/a08abcad-85f1-431b-853e-3599eebed756/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.694750 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" path="/var/lib/kubelet/pods/a12f971e-bd5e-4b60-9d28-06c786d852ae/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.695562 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" path="/var/lib/kubelet/pods/ad73ec2f-ba76-4451-8202-33403a41de12/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.697621 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" path="/var/lib/kubelet/pods/af31d5e0-11e6-433b-a31e-bea14d7e5c95/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.698864 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cf579e-cb45-4984-8558-107b9576d977" path="/var/lib/kubelet/pods/d9cf579e-cb45-4984-8558-107b9576d977/volumes" Jan 21 14:59:36 crc kubenswrapper[4720]: I0121 14:59:36.679183 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:36 crc kubenswrapper[4720]: E0121 14:59:36.680108 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:51 crc kubenswrapper[4720]: I0121 14:59:51.678694 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:51 crc kubenswrapper[4720]: E0121 14:59:51.680921 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.517001 4720 scope.go:117] "RemoveContainer" containerID="3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.550203 4720 scope.go:117] "RemoveContainer" containerID="72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.591168 4720 scope.go:117] "RemoveContainer" containerID="640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.630705 4720 scope.go:117] "RemoveContainer" containerID="76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.674345 4720 scope.go:117] "RemoveContainer" containerID="8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.710074 4720 scope.go:117] "RemoveContainer" containerID="582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3" Jan 21 15:00:00 
crc kubenswrapper[4720]: I0121 15:00:00.152520 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"] Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.153598 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.156846 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.171685 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"] Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.191674 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.312986 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.313039 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.313076 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415022 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.416289 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.421093 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.431109 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.514890 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.996358 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"] Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789457 4720 generic.go:334] "Generic (PLEG): container finished" podID="d92d32a0-256b-4078-a4cf-fe678205141c" containerID="5eabcb934e6e2604ac38974d36efe8a72af780fa2ad0365a3b5f182a6ce58b8c" exitCode=0 Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789566 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerDied","Data":"5eabcb934e6e2604ac38974d36efe8a72af780fa2ad0365a3b5f182a6ce58b8c"} Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789970 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerStarted","Data":"d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6"} Jan 21 15:00:02 crc kubenswrapper[4720]: I0121 15:00:02.686024 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:02 crc kubenswrapper[4720]: E0121 15:00:02.688922 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.092928 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172350 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172480 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172594 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.173751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.180986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz" (OuterVolumeSpecName: "kube-api-access-cvltz") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "kube-api-access-cvltz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.184833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274510 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274554 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274565 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerDied","Data":"d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6"} Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805630 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805420 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:09 crc kubenswrapper[4720]: I0121 15:00:09.039547 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 15:00:09 crc kubenswrapper[4720]: I0121 15:00:09.052162 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.025099 4720 generic.go:334] "Generic (PLEG): container finished" podID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerID="15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453" exitCode=0 Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.025135 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerDied","Data":"15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453"} Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.690463 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dda8050-939a-4a64-b119-b718b60c7887" path="/var/lib/kubelet/pods/4dda8050-939a-4a64-b119-b718b60c7887/volumes" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.451996 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574420 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574871 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.582836 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5" (OuterVolumeSpecName: "kube-api-access-k4gq5") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "kube-api-access-k4gq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.599607 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory" (OuterVolumeSpecName: "inventory") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.602351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677478 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677530 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677543 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045354 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerDied","Data":"e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65"} Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045394 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045402 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.193241 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:12 crc kubenswrapper[4720]: E0121 15:00:12.194077 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194127 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: E0121 15:00:12.194204 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194218 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194580 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194646 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.195948 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198205 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198589 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198731 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.199203 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.227754 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.288624 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.288956 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.289045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc 
kubenswrapper[4720]: I0121 15:00:12.395775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.399716 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.420241 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.549534 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:13 crc kubenswrapper[4720]: I0121 15:00:13.116285 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.060671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerStarted","Data":"cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2"} Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.060961 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerStarted","Data":"7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e"} Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.092161 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" podStartSLOduration=1.6255253939999998 podStartE2EDuration="2.092143049s" podCreationTimestamp="2026-01-21 15:00:12 +0000 UTC" firstStartedPulling="2026-01-21 15:00:13.124935763 +0000 UTC m=+1851.033675695" lastFinishedPulling="2026-01-21 15:00:13.591553418 +0000 UTC m=+1851.500293350" observedRunningTime="2026-01-21 15:00:14.086048794 +0000 UTC m=+1851.994788726" watchObservedRunningTime="2026-01-21 15:00:14.092143049 +0000 UTC m=+1852.000883001" Jan 21 15:00:16 crc kubenswrapper[4720]: I0121 15:00:16.678870 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:16 crc kubenswrapper[4720]: E0121 15:00:16.679975 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:21 crc 
kubenswrapper[4720]: I0121 15:00:21.113609 4720 generic.go:334] "Generic (PLEG): container finished" podID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerID="cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2" exitCode=0 Jan 21 15:00:21 crc kubenswrapper[4720]: I0121 15:00:21.113679 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerDied","Data":"cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2"} Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.497458 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.567801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.567878 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.568603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.573179 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq" (OuterVolumeSpecName: "kube-api-access-kfvdq") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "kube-api-access-kfvdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.592844 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.600178 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670428 4720 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670698 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670773 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerDied","Data":"7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e"} Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132770 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132808 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.207926 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:23 crc kubenswrapper[4720]: E0121 15:00:23.208428 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.208457 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.208749 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.209433 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213387 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213515 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.215077 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.218082 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385599 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385647 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488247 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488809 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.494864 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.496312 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.505624 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.527279 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:24 crc kubenswrapper[4720]: I0121 15:00:24.027614 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:24 crc kubenswrapper[4720]: I0121 15:00:24.140580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerStarted","Data":"402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af"} Jan 21 15:00:25 crc kubenswrapper[4720]: I0121 15:00:25.151341 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerStarted","Data":"e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3"} Jan 21 15:00:25 crc kubenswrapper[4720]: I0121 15:00:25.182879 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" podStartSLOduration=1.7642144960000001 podStartE2EDuration="2.182854033s" podCreationTimestamp="2026-01-21 15:00:23 +0000 UTC" firstStartedPulling="2026-01-21 15:00:24.031008296 +0000 UTC m=+1861.939748228" lastFinishedPulling="2026-01-21 15:00:24.449647823 +0000 UTC m=+1862.358387765" observedRunningTime="2026-01-21 15:00:25.173082363 +0000 UTC m=+1863.081822295" watchObservedRunningTime="2026-01-21 15:00:25.182854033 +0000 UTC m=+1863.091593975" Jan 21 15:00:30 crc kubenswrapper[4720]: I0121 15:00:30.679108 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:30 crc kubenswrapper[4720]: E0121 15:00:30.679732 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:33 crc kubenswrapper[4720]: I0121 15:00:33.210969 4720 generic.go:334] "Generic (PLEG): container finished" podID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerID="e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3" exitCode=0 Jan 21 15:00:33 crc kubenswrapper[4720]: I0121 15:00:33.211082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerDied","Data":"e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3"} Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.044126 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.054904 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.639038 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.689368 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" path="/var/lib/kubelet/pods/b57a2637-15ee-4c59-881b-9364ffde9ffc/volumes" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723434 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723509 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723550 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.732942 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7" (OuterVolumeSpecName: "kube-api-access-68lf7") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). InnerVolumeSpecName "kube-api-access-68lf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.753888 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory" (OuterVolumeSpecName: "inventory") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.754306 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825259 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825546 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825636 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229115 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerDied","Data":"402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af"} Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229170 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229173 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.315753 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:35 crc kubenswrapper[4720]: E0121 15:00:35.318371 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.318397 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.318614 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.319445 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324207 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324528 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324645 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.326370 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.331576 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.332979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.333077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.333165 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434271 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434309 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.446544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.454988 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.463949 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.696274 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:36 crc kubenswrapper[4720]: I0121 15:00:36.260591 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.243483 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerStarted","Data":"80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f"} Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.243801 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerStarted","Data":"d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731"} Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.262382 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" podStartSLOduration=1.802790232 podStartE2EDuration="2.262359645s" podCreationTimestamp="2026-01-21 15:00:35 +0000 UTC" firstStartedPulling="2026-01-21 15:00:36.259490547 +0000 UTC m=+1874.168230479" lastFinishedPulling="2026-01-21 15:00:36.71905996 +0000 UTC m=+1874.627799892" observedRunningTime="2026-01-21 15:00:37.257951431 +0000 UTC m=+1875.166691373" watchObservedRunningTime="2026-01-21 15:00:37.262359645 +0000 UTC m=+1875.171099597" Jan 21 15:00:38 crc kubenswrapper[4720]: I0121 15:00:38.041613 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 15:00:38 crc kubenswrapper[4720]: I0121 15:00:38.043058 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 15:00:38 
crc kubenswrapper[4720]: I0121 15:00:38.689148 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" path="/var/lib/kubelet/pods/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7/volumes" Jan 21 15:00:43 crc kubenswrapper[4720]: I0121 15:00:43.678705 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:43 crc kubenswrapper[4720]: E0121 15:00:43.679222 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:47 crc kubenswrapper[4720]: I0121 15:00:47.326020 4720 generic.go:334] "Generic (PLEG): container finished" podID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerID="80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f" exitCode=0 Jan 21 15:00:47 crc kubenswrapper[4720]: I0121 15:00:47.326089 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerDied","Data":"80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f"} Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.738184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869165 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.875802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j" (OuterVolumeSpecName: "kube-api-access-6mf8j") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "kube-api-access-6mf8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.900993 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory" (OuterVolumeSpecName: "inventory") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.910080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971859 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971892 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971904 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341470 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerDied","Data":"d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731"} Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341505 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731" Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341511 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:57 crc kubenswrapper[4720]: I0121 15:00:57.679584 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:57 crc kubenswrapper[4720]: E0121 15:00:57.680371 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.831215 4720 scope.go:117] "RemoveContainer" containerID="0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.875028 4720 scope.go:117] "RemoveContainer" containerID="1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.934991 4720 scope.go:117] "RemoveContainer" containerID="48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.151331 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:00 crc kubenswrapper[4720]: E0121 15:01:00.152012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.152129 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.152466 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.153113 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.166335 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273345 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273439 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273494 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273529 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375060 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375177 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.381523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.384456 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.386114 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.394276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.475004 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: W0121 15:01:00.920402 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d585381_d477_4c8d_af17_6194044b6de1.slice/crio-7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913 WatchSource:0}: Error finding container 7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913: Status 404 returned error can't find the container with id 7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913 Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.921742 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.460182 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerStarted","Data":"41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57"} Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.460558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerStarted","Data":"7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913"} Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.503735 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29483461-qxsqs" podStartSLOduration=1.5037143670000002 podStartE2EDuration="1.503714367s" podCreationTimestamp="2026-01-21 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:01:01.496669295 +0000 UTC m=+1899.405409247" watchObservedRunningTime="2026-01-21 15:01:01.503714367 +0000 UTC m=+1899.412454309" Jan 21 15:01:03 crc kubenswrapper[4720]: I0121 15:01:03.477024 
4720 generic.go:334] "Generic (PLEG): container finished" podID="4d585381-d477-4c8d-af17-6194044b6de1" containerID="41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57" exitCode=0 Jan 21 15:01:03 crc kubenswrapper[4720]: I0121 15:01:03.477121 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerDied","Data":"41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57"} Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.803400 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859283 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859510 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859541 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859629 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.873899 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.878039 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr" (OuterVolumeSpecName: "kube-api-access-q5vbr") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "kube-api-access-q5vbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.899054 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.925173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data" (OuterVolumeSpecName: "config-data") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961375 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961408 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961417 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961428 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.499772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerDied","Data":"7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913"} Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.500037 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913" Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.499867 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:11 crc kubenswrapper[4720]: I0121 15:01:11.678367 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:11 crc kubenswrapper[4720]: E0121 15:01:11.679381 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.065347 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.077221 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.694363 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" path="/var/lib/kubelet/pods/d8fc07ed-67cb-4459-b7cb-ea8101ea4317/volumes" Jan 21 15:01:25 crc kubenswrapper[4720]: I0121 15:01:25.679074 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:25 crc kubenswrapper[4720]: E0121 15:01:25.679637 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.742157 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:29 crc kubenswrapper[4720]: E0121 15:01:29.742755 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.742941 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.743090 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.743949 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.747373 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ltcrl"/"kube-root-ca.crt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.747878 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ltcrl"/"openshift-service-ca.crt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.768284 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.831453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.831615 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933731 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.957354 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.064607 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.623174 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:30 crc kubenswrapper[4720]: W0121 15:01:30.630714 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ba91fa_9395_4dae_8bf6_384541b2d3ed.slice/crio-616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a WatchSource:0}: Error finding container 616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a: Status 404 returned error can't find the container with id 616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.705669 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a"} Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.245272 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.247915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.280550 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.336860 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.336929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.337065 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.438935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439002 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: 
\"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439096 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439891 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.440150 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.460047 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.582801 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.274678 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.800558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.800604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803289 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" exitCode=0 Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803331 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803353 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"50874e76ea4107b7c07d6f1ccee98b03d59ab44fcfb2f73925bdd79450642dc9"} Jan 21 15:01:39 crc 
kubenswrapper[4720]: I0121 15:01:39.833678 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" podStartSLOduration=2.596554255 podStartE2EDuration="10.833646403s" podCreationTimestamp="2026-01-21 15:01:29 +0000 UTC" firstStartedPulling="2026-01-21 15:01:30.634254829 +0000 UTC m=+1928.542994761" lastFinishedPulling="2026-01-21 15:01:38.871346977 +0000 UTC m=+1936.780086909" observedRunningTime="2026-01-21 15:01:39.830067114 +0000 UTC m=+1937.738807056" watchObservedRunningTime="2026-01-21 15:01:39.833646403 +0000 UTC m=+1937.742386335" Jan 21 15:01:40 crc kubenswrapper[4720]: I0121 15:01:40.678705 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:40 crc kubenswrapper[4720]: E0121 15:01:40.679897 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:41 crc kubenswrapper[4720]: I0121 15:01:41.824849 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} Jan 21 15:01:42 crc kubenswrapper[4720]: E0121 15:01:42.520583 4720 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.103:60486->38.102.83.103:43429: read tcp 38.102.83.103:60486->38.102.83.103:43429: read: connection reset by peer Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.212554 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.214052 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.215863 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ltcrl"/"default-dockercfg-bztpl" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.357632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.357768 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.459851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.459932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.460089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.480250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.533608 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: W0121 15:01:43.563823 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b489c8d_aa41_41cf_a984_9479eda75544.slice/crio-51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830 WatchSource:0}: Error finding container 51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830: Status 404 returned error can't find the container with id 51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830 Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.566419 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.841000 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerStarted","Data":"51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830"} Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.865228 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f448c69d6-sjp2r_3b177763-3020-4854-b45a-43d99221c670/barbican-api-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.870222 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" exitCode=0 Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.870261 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.884414 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f448c69d6-sjp2r_3b177763-3020-4854-b45a-43d99221c670/barbican-api/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.961157 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6898c4b994-dn9qn_bb475766-6891-454b-8f7e-1494d9806891/barbican-keystone-listener-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.968419 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6898c4b994-dn9qn_bb475766-6891-454b-8f7e-1494d9806891/barbican-keystone-listener/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.979860 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8f88c9d47-m5rzn_9355d502-bf01-4465-996d-483d99b92954/barbican-worker-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.989338 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8f88c9d47-m5rzn_9355d502-bf01-4465-996d-483d99b92954/barbican-worker/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.056711 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6_b96fb314-d163-41a0-b2b0-9a9c117d504c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.101413 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/ceilometer-central-agent/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.125667 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/ceilometer-notification-agent/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.133134 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/sg-core/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.137583 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/proxy-httpd/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.149189 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9_09ee2ae5-f10a-4080-90df-29c01525e871/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.160362 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4281fdf-eb56-41e8-a750-13ee7ac37bea/cinder-api-log/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.199027 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4281fdf-eb56-41e8-a750-13ee7ac37bea/cinder-api/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.257344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0896fa5e-6919-42bf-9e61-cf73218e9edf/cinder-scheduler/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.278842 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0896fa5e-6919-42bf-9e61-cf73218e9edf/probe/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.316291 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xdsct_7e4bbdff-6382-41c7-a054-bb15c6923e32/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.341848 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq_5e910d6d-e1c9-447a-9584-0338f9151f26/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.376859 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-bgxfr_248ea464-73a3-4083-bb27-fc2cb7347224/dnsmasq-dns/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.386549 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-bgxfr_248ea464-73a3-4083-bb27-fc2cb7347224/init/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.438885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rwbjg_5c493941-48f3-4a3e-a66a-4f045487005e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.491038 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69cc8766db-gdch7_0edd5078-75bc-4823-b52f-ad5effeace06/keystone-api/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.498970 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29483461-qxsqs_4d585381-d477-4c8d-af17-6194044b6de1/keystone-cron/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.512077 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_60d4c6e3-4a01-421e-aad1-1972ed16e528/kube-state-metrics/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.883588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.910834 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6x2nt" podStartSLOduration=6.435078757 podStartE2EDuration="13.910815182s" podCreationTimestamp="2026-01-21 15:01:34 +0000 UTC" firstStartedPulling="2026-01-21 15:01:39.804827385 +0000 UTC m=+1937.713567317" lastFinishedPulling="2026-01-21 15:01:47.2805638 +0000 UTC m=+1945.189303742" observedRunningTime="2026-01-21 15:01:47.903032409 +0000 UTC m=+1945.811772341" watchObservedRunningTime="2026-01-21 15:01:47.910815182 +0000 UTC m=+1945.819555104" Jan 21 15:01:50 crc kubenswrapper[4720]: I0121 15:01:50.994458 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:50 crc kubenswrapper[4720]: I0121 15:01:50.996738 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.011447 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111465 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111628 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.213829 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.213939 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.214029 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.217204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.217225 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.232913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.321793 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:54 crc kubenswrapper[4720]: I0121 15:01:54.583855 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:54 crc kubenswrapper[4720]: I0121 15:01:54.584194 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.634441 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" probeResult="failure" output=< Jan 21 15:01:55 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:01:55 crc kubenswrapper[4720]: > Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.678080 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:55 crc kubenswrapper[4720]: E0121 15:01:55.678462 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.982999 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_73c29d26-d7a2-40b5-81b8-ffda85c198d3/memcached/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.015171 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8b4f85f7-4kz9x_7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7/neutron-api/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.029007 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8b4f85f7-4kz9x_7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7/neutron-httpd/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.128062 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_33c62270-7ab4-416b-bf5f-e0007f477733/nova-api-log/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.229885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_33c62270-7ab4-416b-bf5f-e0007f477733/nova-api-api/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.355721 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_496cefe3-f97b-4d8c-9a25-4a6533d9e64c/nova-cell0-conductor-conductor/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.368775 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.454444 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_679bb64e-c157-415f-9214-0f4e62001f03/nova-cell1-conductor-conductor/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.505816 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5ea3e3dd-0e39-4a28-9112-27f0874af221/nova-cell1-novncproxy-novncproxy/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.579424 4720 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7177980c-4db3-4902-aac2-c0825b778b2a/nova-metadata-log/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.872974 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7177980c-4db3-4902-aac2-c0825b778b2a/nova-metadata-metadata/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.961730 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_039c7115-f471-47ad-a7c4-75b1d7a40a94/nova-scheduler-scheduler/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.986432 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8a6a2220-24c4-4a0b-b72e-848dbac6a14b/galera/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.993968 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerStarted","Data":"a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996452 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" exitCode=0 Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996478 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"ec6aaee5e30b27a55c2206c76a5a2e84eaba6a236f2d843ee4cfb96336a189b9"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.998645 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8a6a2220-24c4-4a0b-b72e-848dbac6a14b/mysql-bootstrap/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.031621 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab11441b-6bc4-4883-8a1e-866b31b425e9/galera/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.048432 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab11441b-6bc4-4883-8a1e-866b31b425e9/mysql-bootstrap/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.048482 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" podStartSLOduration=1.7364206210000002 podStartE2EDuration="14.048470768s" podCreationTimestamp="2026-01-21 15:01:43 +0000 UTC" firstStartedPulling="2026-01-21 15:01:43.566104382 +0000 UTC m=+1941.474844314" lastFinishedPulling="2026-01-21 15:01:55.878154529 +0000 UTC m=+1953.786894461" observedRunningTime="2026-01-21 15:01:57.013256156 +0000 UTC m=+1954.921996088" watchObservedRunningTime="2026-01-21 15:01:57.048470768 +0000 UTC m=+1954.957210700" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.057465 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6/openstackclient/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 
15:01:57.089262 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-h55pf_4fc0e40b-c337-42d2-87a3-2eedfa2f1a65/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.106324 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovsdb-server/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.114497 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovs-vswitchd/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.122542 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovsdb-server-init/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.133369 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wpvzs_95379233-3cd8-4dd3-bf0f-b8198f2258e1/ovn-controller/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.146331 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_262f8354-3f7b-483f-940d-8b0f394e344a/ovn-northd/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.153579 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_262f8354-3f7b-483f-940d-8b0f394e344a/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.172379 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8cf4740-b779-4759-92d1-22ce3e5f1369/ovsdbserver-nb/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.180433 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8cf4740-b779-4759-92d1-22ce3e5f1369/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.195429 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b833ac6-f279-4dfb-84fb-22b531e6b7ef/ovsdbserver-sb/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.201016 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b833ac6-f279-4dfb-84fb-22b531e6b7ef/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.228077 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8648996d7d-4f2q4_37e9aac3-9710-4d1c-88a7-1a0a22b5a593/placement-log/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.242540 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8648996d7d-4f2q4_37e9aac3-9710-4d1c-88a7-1a0a22b5a593/placement-api/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.258895 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4906b5ed-c663-4e81-ab33-2b8f33777cd1/rabbitmq/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.267498 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4906b5ed-c663-4e81-ab33-2b8f33777cd1/setup-container/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.286366 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f73dd82b-9ad1-4deb-b244-6d42a3f25f89/rabbitmq/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.292572 4720 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_f73dd82b-9ad1-4deb-b244-6d42a3f25f89/setup-container/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.310419 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr_64e0dfca-6b74-47c9-8f6f-76de697cf3e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.321801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt_0506243d-6216-4541-8f14-8b2c2beb409b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.332758 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bzgnr_595ce90e-f537-4d7f-be8f-a4da40103ab1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.360944 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4ngb6_d64c2129-c3c8-4f00-ac2f-750094e2ea79/ssh-known-hosts-edpm-deployment/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.376360 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd_1708e39a-582c-42e2-8c2e-d71fef75a183/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:59 crc kubenswrapper[4720]: I0121 15:01:59.016989 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} Jan 21 15:02:00 crc kubenswrapper[4720]: I0121 15:02:00.044807 4720 scope.go:117] "RemoveContainer" containerID="08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac" Jan 21 15:02:01 crc kubenswrapper[4720]: I0121 15:02:01.034801 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" exitCode=0 Jan 21 15:02:01 crc kubenswrapper[4720]: I0121 15:02:01.034942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} Jan 21 15:02:04 crc kubenswrapper[4720]: I0121 15:02:04.076684 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} Jan 21 15:02:04 crc kubenswrapper[4720]: I0121 15:02:04.100987 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxdtl" podStartSLOduration=7.461914655 podStartE2EDuration="14.100969927s" podCreationTimestamp="2026-01-21 15:01:50 +0000 UTC" firstStartedPulling="2026-01-21 15:01:56.998237375 +0000 UTC m=+1954.906977307" lastFinishedPulling="2026-01-21 15:02:03.637292647 +0000 UTC m=+1961.546032579" observedRunningTime="2026-01-21 15:02:04.100951717 +0000 UTC m=+1962.009691659" watchObservedRunningTime="2026-01-21 15:02:04.100969927 +0000 UTC m=+1962.009709859" Jan 21 15:02:05 crc 
kubenswrapper[4720]: I0121 15:02:05.678160 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" probeResult="failure" output=< Jan 21 15:02:05 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:02:05 crc kubenswrapper[4720]: > Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.390280 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log" Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.396707 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log" Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.411822 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.409443 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.419126 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.429216 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.441013 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.449752 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.457981 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.467846 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.477914 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.494344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.520930 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.542212 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 
15:02:09.817313 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.829766 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log" Jan 21 15:02:10 crc kubenswrapper[4720]: I0121 15:02:10.678495 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:02:10 crc kubenswrapper[4720]: E0121 15:02:10.678921 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.322981 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.323378 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.385926 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.134440 4720 generic.go:334] "Generic (PLEG): container finished" podID="4b489c8d-aa41-41cf-a984-9479eda75544" containerID="a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d" exitCode=0 Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.134530 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerDied","Data":"a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d"} Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.180033 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.230284 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.271324 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.304993 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.311661 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"4b489c8d-aa41-41cf-a984-9479eda75544\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"4b489c8d-aa41-41cf-a984-9479eda75544\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340832 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host" (OuterVolumeSpecName: "host") pod "4b489c8d-aa41-41cf-a984-9479eda75544" (UID: "4b489c8d-aa41-41cf-a984-9479eda75544"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.341892 4720 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.349970 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx" (OuterVolumeSpecName: "kube-api-access-m6pkx") pod "4b489c8d-aa41-41cf-a984-9479eda75544" (UID: "4b489c8d-aa41-41cf-a984-9479eda75544"). InnerVolumeSpecName "kube-api-access-m6pkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.443983 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152643 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152646 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152953 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxdtl" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server" containerID="cri-o://7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" gracePeriod=2 Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.564075 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:14 crc kubenswrapper[4720]: E0121 15:02:14.564775 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.564791 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.565039 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.565782 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.567905 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ltcrl"/"default-dockercfg-bztpl" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.603011 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.654535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663451 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663489 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.664051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.665109 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities" (OuterVolumeSpecName: "utilities") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.684036 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm" (OuterVolumeSpecName: "kube-api-access-lmktm") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "kube-api-access-lmktm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.698845 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" path="/var/lib/kubelet/pods/4b489c8d-aa41-41cf-a984-9479eda75544/volumes" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.700469 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.723237 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765576 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765696 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765948 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766781 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766809 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766823 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.786791 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.935137 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: W0121 15:02:14.956322 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf099413_bd8b_4037_89d4_60155f99f19e.slice/crio-2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910 WatchSource:0}: Error finding container 2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910: Status 404 returned error can't find the container with id 2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.027262 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164127 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" exitCode=0 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164230 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164302 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"ec6aaee5e30b27a55c2206c76a5a2e84eaba6a236f2d843ee4cfb96336a189b9"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164335 4720 scope.go:117] "RemoveContainer" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164542 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166392 4720 generic.go:334] "Generic (PLEG): container finished" podID="df099413-bd8b-4037-89d4-60155f99f19e" containerID="299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37" exitCode=1 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" event={"ID":"df099413-bd8b-4037-89d4-60155f99f19e","Type":"ContainerDied","Data":"299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166672 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" event={"ID":"df099413-bd8b-4037-89d4-60155f99f19e","Type":"ContainerStarted","Data":"2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.185369 4720 scope.go:117] "RemoveContainer" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.215152 4720 scope.go:117] "RemoveContainer" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.218846 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.229466 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.238494 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244207 4720 scope.go:117] "RemoveContainer" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.244536 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": container with ID starting with 7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df not found: ID does not exist" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244565 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} err="failed to get container status \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": rpc error: code = NotFound desc = could not find container \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": container with ID starting with 7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244584 4720 scope.go:117] "RemoveContainer" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.245050 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": container with ID starting with 
a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed not found: ID does not exist" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245075 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} err="failed to get container status \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": rpc error: code = NotFound desc = could not find container \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": container with ID starting with a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245089 4720 scope.go:117] "RemoveContainer" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.245611 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": container with ID starting with fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992 not found: ID does not exist" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245686 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992"} err="failed to get container status \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": rpc error: code = NotFound desc = could not find container \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": container with ID starting with fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992 not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.247164 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.174445 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" containerID="cri-o://ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" gracePeriod=2 Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.356113 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.396469 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"df099413-bd8b-4037-89d4-60155f99f19e\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.396560 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"df099413-bd8b-4037-89d4-60155f99f19e\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.397684 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host" (OuterVolumeSpecName: "host") pod "df099413-bd8b-4037-89d4-60155f99f19e" (UID: "df099413-bd8b-4037-89d4-60155f99f19e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.419917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv" (OuterVolumeSpecName: "kube-api-access-9dfnv") pod "df099413-bd8b-4037-89d4-60155f99f19e" (UID: "df099413-bd8b-4037-89d4-60155f99f19e"). InnerVolumeSpecName "kube-api-access-9dfnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.504522 4720 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.504567 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.660905 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.688812 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" path="/var/lib/kubelet/pods/4f558038-e16a-4aa1-bb7b-ddb6f14987a7/volumes" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.689556 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df099413-bd8b-4037-89d4-60155f99f19e" path="/var/lib/kubelet/pods/df099413-bd8b-4037-89d4-60155f99f19e/volumes" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707112 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707325 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707383 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.721411 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk" (OuterVolumeSpecName: "kube-api-access-l5vdk") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "kube-api-access-l5vdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.726339 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities" (OuterVolumeSpecName: "utilities") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.810556 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.810601 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.881198 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.912945 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.184370 4720 scope.go:117] "RemoveContainer" containerID="299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.184385 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186722 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" exitCode=0 Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186772 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.187490 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"50874e76ea4107b7c07d6f1ccee98b03d59ab44fcfb2f73925bdd79450642dc9"} Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.219885 4720 scope.go:117] "RemoveContainer" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.232145 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.240469 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.242562 4720 scope.go:117] "RemoveContainer" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.261905 4720 scope.go:117] "RemoveContainer" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.278577 4720 scope.go:117] "RemoveContainer" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.278964 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": container with ID starting with ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851 not found: ID does not exist" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279004 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} err="failed to 
get container status \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": rpc error: code = NotFound desc = could not find container \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": container with ID starting with ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851 not found: ID does not exist" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279023 4720 scope.go:117] "RemoveContainer" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.279287 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": container with ID starting with 6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d not found: ID does not exist" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279355 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} err="failed to get container status \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": rpc error: code = NotFound desc = could not find container \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": container with ID starting with 6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d not found: ID does not exist" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279392 4720 scope.go:117] "RemoveContainer" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.279706 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": container with ID starting with a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7 not found: ID does not exist" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279755 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7"} err="failed to get container status \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": rpc error: code = NotFound desc = could not find container \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": container with ID starting with a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7 not found: ID does not exist" Jan 21 15:02:18 crc kubenswrapper[4720]: I0121 15:02:18.688951 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" path="/var/lib/kubelet/pods/26e526e0-a293-4e24-a0b3-cc7fa0e9308b/volumes" Jan 21 15:02:23 crc kubenswrapper[4720]: I0121 15:02:23.678189 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:02:24 crc kubenswrapper[4720]: I0121 15:02:24.244053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" 
event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"} Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.711417 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.745407 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.755551 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.765741 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.814461 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.827346 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.873630 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.886050 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.901885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.145036 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.156968 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.215728 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.227821 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.266836 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.307896 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.388708 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.398812 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.421606 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.582863 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.409542 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.416996 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.465604 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.487860 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.510410 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.518683 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.604872 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.616847 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.627438 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.429294 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jtj6g_48af697e-308a-4bdd-a5d8-d86cd5c4fb0c/control-plane-machine-set-operator/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.445848 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/kube-rbac-proxy/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.451426 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/machine-api-operator/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.093479 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.105695 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.112420 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.080601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-f9sxz_e3d11ff0-1741-4f0d-aa50-6e0144e843a6/nmstate-console-plugin/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.099333 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l74mh_da16493b-aa03-4556-b3ce-d87ccfdbba70/nmstate-handler/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.111922 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/nmstate-metrics/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.122476 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/kube-rbac-proxy/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.136301 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mclmr_2bdd7be0-b9cf-4501-9816-87831d74becc/nmstate-operator/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.146559 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-xcckr_c338dc84-0c3a-44c4-8f08-82001f532c2b/nmstate-webhook/0.log" Jan 21 15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.103346 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log" Jan 21 15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.110725 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log" Jan 21 
15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.135783 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.040801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.056605 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.062613 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.067565 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.079448 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.084004 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.092591 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.100743 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.120418 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.145013 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.154326 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.421319 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log" Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.434721 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.317803 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/extract/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.327868 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/util/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.342552 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/pull/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.356217 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/extract/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.368638 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/util/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.392967 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/pull/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.646722 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/registry-server/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.652866 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/extract-utilities/0.log" Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.670481 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/extract-content/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.007963 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/registry-server/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.012919 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/extract-utilities/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.018571 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/extract-content/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.031598 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s9hd2_fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c/marketplace-operator/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.174142 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/registry-server/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.194300 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/extract-utilities/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.203780 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/extract-content/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.558957 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/registry-server/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.564575 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/extract-utilities/0.log" Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.572251 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/extract-content/0.log" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.711998 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.712932 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.712951 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.712983 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.712993 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713006 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-utilities" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713016 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-utilities" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713025 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-content" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713032 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-content" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713051 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-content" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713060 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-content" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713075 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-utilities" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713082 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-utilities" Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713095 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" 
containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713103 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713336 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713366 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713380 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.715122 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.725677 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.779524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.779955 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.780015 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881675 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881753 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881778 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: 
\"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.882280 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.882496 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.903240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.037907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.598693 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.736392 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerStarted","Data":"0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be"} Jan 21 15:03:24 crc kubenswrapper[4720]: I0121 15:03:24.750034 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735" exitCode=0 Jan 21 15:03:24 crc kubenswrapper[4720]: I0121 15:03:24.751444 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735"} Jan 21 15:03:26 crc kubenswrapper[4720]: I0121 15:03:26.770298 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330" exitCode=0 Jan 21 15:03:26 crc kubenswrapper[4720]: I0121 15:03:26.770338 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330"} Jan 21 15:03:27 crc kubenswrapper[4720]: I0121 15:03:27.795053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerStarted","Data":"0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33"} Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.681787 4720 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-vlw79" podStartSLOduration=6.014809144 podStartE2EDuration="8.681763889s" podCreationTimestamp="2026-01-21 15:03:22 +0000 UTC" firstStartedPulling="2026-01-21 15:03:24.753083393 +0000 UTC m=+2042.661823325" lastFinishedPulling="2026-01-21 15:03:27.420038128 +0000 UTC m=+2045.328778070" observedRunningTime="2026-01-21 15:03:27.824294195 +0000 UTC m=+2045.733034137" watchObservedRunningTime="2026-01-21 15:03:30.681763889 +0000 UTC m=+2048.590503831" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.690773 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"] Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.693062 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.705643 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"] Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.744773 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.745020 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.745063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846331 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846722 
4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846760 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.877016 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.013912 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.681836 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"] Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.826911 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerStarted","Data":"832a33e027291133d6801641b6adbeaae84d163c91a9f269a3243eee008052ee"} Jan 21 15:03:32 crc kubenswrapper[4720]: I0121 15:03:32.838238 4720 generic.go:334] "Generic (PLEG): container finished" podID="2bbd360e-7396-4ec2-bc33-d4c909b4c7e4" containerID="9f2b6912011fb3da8fb122decfaf75a67e87b0a016def09d17bba5e8af3a0bf9" exitCode=0 Jan 21 15:03:32 crc kubenswrapper[4720]: I0121 15:03:32.838546 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerDied","Data":"9f2b6912011fb3da8fb122decfaf75a67e87b0a016def09d17bba5e8af3a0bf9"} Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.039482 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.039544 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.093152 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.911085 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:37 crc kubenswrapper[4720]: I0121 15:03:37.883480 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:37 crc kubenswrapper[4720]: I0121 15:03:37.884058 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vlw79" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" 
containerName="registry-server" containerID="cri-o://0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33" gracePeriod=2 Jan 21 15:03:38 crc kubenswrapper[4720]: I0121 15:03:38.894812 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33" exitCode=0 Jan 21 15:03:38 crc kubenswrapper[4720]: I0121 15:03:38.894889 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33"} Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.914069 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be"} Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923553 4720 scope.go:117] "RemoveContainer" containerID="0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33" Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923595 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79" Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.971897 4720 scope.go:117] "RemoveContainer" containerID="aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330" Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997436 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997610 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997645 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.999331 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities" (OuterVolumeSpecName: "utilities") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.015076 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g" (OuterVolumeSpecName: "kube-api-access-pz52g") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "kube-api-access-pz52g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.076882 4720 scope.go:117] "RemoveContainer" containerID="977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.099987 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.100030 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.113583 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.202244 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.267027 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.274750 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vlw79"] Jan 21 15:03:42 crc kubenswrapper[4720]: E0121 15:03:42.296249 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfad437f_5d3f_4e35_88e2_1ee6a3a4b6e4.slice/crio-0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbd360e_7396_4ec2_bc33_d4c909b4c7e4.slice/crio-2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.688258 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" path="/var/lib/kubelet/pods/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4/volumes" Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.936700 4720 generic.go:334] "Generic (PLEG): container finished" podID="2bbd360e-7396-4ec2-bc33-d4c909b4c7e4" containerID="2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db" exitCode=0 Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.936781 4720 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerDied","Data":"2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db"} Jan 21 15:03:45 crc kubenswrapper[4720]: I0121 15:03:45.963383 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerStarted","Data":"2e0f732f0a5c1ef346767faf4b3f12df05889a1ed2ccb85a7b3529777aa53ac6"} Jan 21 15:03:45 crc kubenswrapper[4720]: I0121 15:03:45.990284 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bcpwx" podStartSLOduration=3.989257816 podStartE2EDuration="15.990266509s" podCreationTimestamp="2026-01-21 15:03:30 +0000 UTC" firstStartedPulling="2026-01-21 15:03:32.84003234 +0000 UTC m=+2050.748772272" lastFinishedPulling="2026-01-21 15:03:44.841041033 +0000 UTC m=+2062.749780965" observedRunningTime="2026-01-21 15:03:45.98245057 +0000 UTC m=+2063.891190502" watchObservedRunningTime="2026-01-21 15:03:45.990266509 +0000 UTC m=+2063.899006441" Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.014367 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.014972 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.065287 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.068595 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bcpwx" Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.172713 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"] Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.211998 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.212225 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kb2c7" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" containerID="cri-o://e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa" gracePeriod=2 Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.031516 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa" exitCode=0 Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.031796 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa"} Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.912815 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934175 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934295 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934332 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.937076 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities" (OuterVolumeSpecName: "utilities") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.943596 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg" (OuterVolumeSpecName: "kube-api-access-rfcfg") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "kube-api-access-rfcfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.034391 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036420 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036498 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036568 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.047540 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.048617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420"} Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.048691 4720 scope.go:117] "RemoveContainer" containerID="e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.096482 4720 scope.go:117] "RemoveContainer" containerID="c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.098386 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.107975 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.139083 4720 scope.go:117] "RemoveContainer" containerID="43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb" Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.686874 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" path="/var/lib/kubelet/pods/c9a5b258-9d31-4031-85f0-1c8d00da3dda/volumes" Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.928594 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log" Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.938510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log" Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.954690 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.085210 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.102535 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.132452 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.934306 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.944418 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.950717 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log" 
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.959580 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.968335 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.975559 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.982678 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.989112 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log" Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.998739 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.021249 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.037825 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.353719 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.370126 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.397923 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.412123 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.424307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.429480 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.476315 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log" Jan 21 15:04:35 crc 
kubenswrapper[4720]: I0121 15:04:35.498547 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.546534 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.556428 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.565241 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.771118 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.782677 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.835203 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.846463 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.877047 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log" Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.925406 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.006974 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.025496 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.040153 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.183134 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 
15:04:36.731495 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.767884 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log" Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.778307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.027112 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.044951 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.116688 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.154019 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.181006 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.194108 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.242248 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.253353 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.267259 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.820987 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jtj6g_48af697e-308a-4bdd-a5d8-d86cd5c4fb0c/control-plane-machine-set-operator/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.834801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/kube-rbac-proxy/0.log" Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.844148 4720 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/machine-api-operator/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.031295 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.040624 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.048151 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.057702 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.096087 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.107096 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.155457 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.165184 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.177322 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.411098 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-f9sxz_e3d11ff0-1741-4f0d-aa50-6e0144e843a6/nmstate-console-plugin/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.412008 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.428391 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l74mh_da16493b-aa03-4556-b3ce-d87ccfdbba70/nmstate-handler/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.429086 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.444484 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/nmstate-metrics/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.454272 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/kube-rbac-proxy/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.477250 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mclmr_2bdd7be0-b9cf-4501-9816-87831d74becc/nmstate-operator/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.486520 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-xcckr_c338dc84-0c3a-44c4-8f08-82001f532c2b/nmstate-webhook/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.489024 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.499107 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.537378 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.580510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.657593 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.666814 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.688454 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log" Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.831510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.692053 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.703609 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.755903 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.784735 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.806776 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.824968 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.896104 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.904686 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log" Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.914457 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.034344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/kube-multus-additional-cni-plugins/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.043589 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/egress-router-binary-copy/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.053307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/cni-plugins/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.060495 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/bond-cni-plugin/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.067567 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/routeoverride-cni/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.077898 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/whereabouts-cni-bincopy/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.086423 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/whereabouts-cni/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.113599 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-7mfnf_92d3c944-8def-4f95-a3cb-781f929f5f28/multus-admission-controller/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.119101 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-7mfnf_92d3c944-8def-4f95-a3cb-781f929f5f28/kube-rbac-proxy/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.175244 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.212601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/1.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.241714 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x48m6_139c8416-e015-49e4-adfe-32f9e142621f/network-metrics-daemon/0.log" Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.246813 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x48m6_139c8416-e015-49e4-adfe-32f9e142621f/kube-rbac-proxy/0.log" Jan 21 15:04:52 crc kubenswrapper[4720]: I0121 15:04:52.880443 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:04:52 crc kubenswrapper[4720]: I0121 15:04:52.880967 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:05:22 crc kubenswrapper[4720]: I0121 15:05:22.880304 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:05:22 crc kubenswrapper[4720]: I0121 15:05:22.881724 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.879543 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.879982 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 
15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880017 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880535 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880575 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7" gracePeriod=600 Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.036699 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7" exitCode=0 Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.036779 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"} Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.037261 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"} Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.037286 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:08:00 crc kubenswrapper[4720]: I0121 15:08:00.715053 4720 scope.go:117] "RemoveContainer" containerID="a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d" Jan 21 15:08:22 crc kubenswrapper[4720]: I0121 15:08:22.880289 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:08:22 crc kubenswrapper[4720]: I0121 15:08:22.880937 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:08:52 crc kubenswrapper[4720]: I0121 15:08:52.880551 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:08:52 crc kubenswrapper[4720]: I0121 15:08:52.881131 4720 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.879915 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.880818 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.880912 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.882290 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.882899 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" gracePeriod=600 Jan 21 15:09:23 crc kubenswrapper[4720]: E0121 15:09:23.016597 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837718 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" exitCode=0 Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837753 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"} Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837799 4720 scope.go:117] "RemoveContainer" containerID="3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7" Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.838449 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:09:23 crc kubenswrapper[4720]: E0121 
15:09:23.838758 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:09:36 crc kubenswrapper[4720]: I0121 15:09:36.678711 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:09:36 crc kubenswrapper[4720]: E0121 15:09:36.679968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:09:47 crc kubenswrapper[4720]: I0121 15:09:47.677991 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:09:47 crc kubenswrapper[4720]: E0121 15:09:47.678680 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:00 crc kubenswrapper[4720]: I0121 15:10:00.678009 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:00 crc kubenswrapper[4720]: E0121 15:10:00.678768 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:13 crc kubenswrapper[4720]: I0121 15:10:13.678412 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:13 crc kubenswrapper[4720]: E0121 15:10:13.679162 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:24 crc kubenswrapper[4720]: I0121 15:10:24.678816 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:24 crc kubenswrapper[4720]: E0121 15:10:24.679625 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:39 crc kubenswrapper[4720]: I0121 15:10:39.678100 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:39 crc kubenswrapper[4720]: E0121 15:10:39.678969 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:52 crc kubenswrapper[4720]: I0121 15:10:52.691236 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:52 crc kubenswrapper[4720]: E0121 15:10:52.692118 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:06 crc kubenswrapper[4720]: I0121 15:11:06.678223 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:06 crc kubenswrapper[4720]: E0121 15:11:06.679781 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:20 crc kubenswrapper[4720]: I0121 15:11:20.681468 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:20 crc kubenswrapper[4720]: E0121 15:11:20.682370 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.257008 4720 generic.go:334] "Generic (PLEG): container finished" podID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" exitCode=0 Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.257199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerDied","Data":"1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77"} Jan 21 15:11:23 crc 
kubenswrapper[4720]: I0121 15:11:23.257758 4720 scope.go:117] "RemoveContainer" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.589885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/gather/0.log" Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.985930 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.986652 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" containerID="cri-o://c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" gracePeriod=2 Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.993106 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.342003 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.342953 4720 generic.go:334] "Generic (PLEG): container finished" podID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerID="c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" exitCode=143 Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.961891 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.962560 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.035023 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.036326 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.042731 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx" (OuterVolumeSpecName: "kube-api-access-mvtcx") pod "32ba91fa-9395-4dae-8bf6-384541b2d3ed" (UID: "32ba91fa-9395-4dae-8bf6-384541b2d3ed"). InnerVolumeSpecName "kube-api-access-mvtcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.138547 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.238850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "32ba91fa-9395-4dae-8bf6-384541b2d3ed" (UID: "32ba91fa-9395-4dae-8bf6-384541b2d3ed"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.240233 4720 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.362641 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.362981 4720 scope.go:117] "RemoveContainer" containerID="c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.363122 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.385790 4720 scope.go:117] "RemoveContainer" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" Jan 21 15:11:34 crc kubenswrapper[4720]: I0121 15:11:34.679155 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:34 crc kubenswrapper[4720]: E0121 15:11:34.679842 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:34 crc kubenswrapper[4720]: I0121 15:11:34.687923 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" path="/var/lib/kubelet/pods/32ba91fa-9395-4dae-8bf6-384541b2d3ed/volumes" Jan 21 15:11:49 crc kubenswrapper[4720]: I0121 15:11:49.678127 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:49 crc kubenswrapper[4720]: E0121 15:11:49.678825 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:03 crc kubenswrapper[4720]: I0121 15:12:03.678144 4720 scope.go:117] "RemoveContainer" 
containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:03 crc kubenswrapper[4720]: E0121 15:12:03.678900 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:18 crc kubenswrapper[4720]: I0121 15:12:18.678516 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:18 crc kubenswrapper[4720]: E0121 15:12:18.679406 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398021 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398892 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398903 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398916 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398922 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398932 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398938 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398948 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398953 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398965 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398971 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398989 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" 
containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398997 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.399010 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399016 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.399027 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399034 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399175 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399186 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399208 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399216 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.400488 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.406380 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437417 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437490 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437543 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540783 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540829 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.559507 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.729471 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.196061 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.678013 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:31 crc kubenswrapper[4720]: E0121 15:12:31.678622 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868364 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" exitCode=0 Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"} Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868440 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"74fdee2e02ce3e51cc54b0243c921199f45d69bbbd52626ce7e362c84a2ea09a"} Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.870885 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:12:32 crc kubenswrapper[4720]: I0121 15:12:32.877520 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} Jan 21 15:12:36 crc kubenswrapper[4720]: I0121 15:12:36.924112 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a" exitCode=0 Jan 21 15:12:36 crc kubenswrapper[4720]: I0121 15:12:36.924211 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} Jan 21 15:12:38 crc kubenswrapper[4720]: I0121 15:12:38.941403 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" 
event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} Jan 21 15:12:38 crc kubenswrapper[4720]: I0121 15:12:38.966724 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bpd4p" podStartSLOduration=2.598890247 podStartE2EDuration="8.966643425s" podCreationTimestamp="2026-01-21 15:12:30 +0000 UTC" firstStartedPulling="2026-01-21 15:12:31.870579469 +0000 UTC m=+2589.779319401" lastFinishedPulling="2026-01-21 15:12:38.238332647 +0000 UTC m=+2596.147072579" observedRunningTime="2026-01-21 15:12:38.964046425 +0000 UTC m=+2596.872786397" watchObservedRunningTime="2026-01-21 15:12:38.966643425 +0000 UTC m=+2596.875383357" Jan 21 15:12:40 crc kubenswrapper[4720]: I0121 15:12:40.730360 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:40 crc kubenswrapper[4720]: I0121 15:12:40.730750 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:41 crc kubenswrapper[4720]: I0121 15:12:41.776052 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bpd4p" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" probeResult="failure" output=< Jan 21 15:12:41 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:12:41 crc kubenswrapper[4720]: > Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.679077 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:45 crc kubenswrapper[4720]: E0121 15:12:45.680146 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.917368 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.920293 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.929243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"]
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024072 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024181 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.126157 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.126876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127349 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.145995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.254005 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.790773 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"]
Jan 21 15:12:47 crc kubenswrapper[4720]: I0121 15:12:47.003758 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"}
Jan 21 15:12:47 crc kubenswrapper[4720]: I0121 15:12:47.004319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"1c76c8ddc8c4ad7acb88ec4e3f6aae03bea3b9dd7c84d7a562ccaa3d54ee501e"}
Jan 21 15:12:48 crc kubenswrapper[4720]: I0121 15:12:48.010833 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" exitCode=0
Jan 21 15:12:48 crc kubenswrapper[4720]: I0121 15:12:48.010885 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"}
Jan 21 15:12:49 crc kubenswrapper[4720]: I0121 15:12:49.024352 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"}
Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.033112 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" exitCode=0
Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.033157 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"}
Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.790853 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bpd4p"
Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.834102 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bpd4p"
Jan 21 15:12:51 crc kubenswrapper[4720]: I0121 15:12:51.045535 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"}
Jan 21 15:12:51 crc kubenswrapper[4720]: I0121 15:12:51.067613 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxjff" podStartSLOduration=3.621321766 podStartE2EDuration="6.067597139s" podCreationTimestamp="2026-01-21 15:12:45 +0000 UTC" firstStartedPulling="2026-01-21 15:12:48.014691074 +0000 UTC m=+2605.923431006" lastFinishedPulling="2026-01-21 15:12:50.460966457 +0000 UTC m=+2608.369706379" observedRunningTime="2026-01-21 15:12:51.064133977 +0000 UTC m=+2608.972873919" watchObservedRunningTime="2026-01-21 15:12:51.067597139 +0000 UTC m=+2608.976337071"
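The startup-latency entry above is internally consistent: podStartSLOduration is the end-to-end startup time minus the time spent pulling images, which the tracker brackets with firstStartedPulling/lastFinishedPulling. Using the monotonic (m=+...) timestamps:

    pull time           = 2608.369706379 - 2605.923431006 = 2.446275373s
    podStartSLOduration = 6.067597139s - 2.446275373s    = 3.621321766s

matching the logged podStartSLOduration=3.621321766. The same relation holds for the earlier redhat-operators-bpd4p entry: 8.966643425s - (2596.147072579 - 2589.779319401)s = 8.966643425s - 6.367753178s = 2.598890247s.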
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.083423 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"]
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.084029 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bpd4p" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" containerID="cri-o://fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" gracePeriod=2
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.537604 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p"
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662324 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") "
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662426 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") "
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662508 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") "
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.663589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities" (OuterVolumeSpecName: "utilities") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.669855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52" (OuterVolumeSpecName: "kube-api-access-fms52") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "kube-api-access-fms52". PluginName "kubernetes.io/projected", VolumeGidValue ""
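"Killing container with a grace period ... gracePeriod=2" above corresponds to a CRI StopContainer call: the runtime delivers SIGTERM (or the image's STOPSIGNAL) and escalates to SIGKILL after the timeout. A sketch of the same call made directly against the CRI API; the CRI-O socket path is the usual default, not something this log states, and the container ID and 2s grace period are taken from the log line.

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed default CRI-O socket; adjust for other runtimes.
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	_, err = rt.StopContainer(ctx, &runtimeapi.StopContainerRequest{
		ContainerId: "fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3",
		Timeout:     2, // seconds before the runtime escalates to SIGKILL
	})
	fmt.Println("stop:", err)
}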
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.764963 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.765218 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.812070 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.867879 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.072966 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" exitCode=0 Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073017 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"74fdee2e02ce3e51cc54b0243c921199f45d69bbbd52626ce7e362c84a2ea09a"} Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073068 4720 scope.go:117] "RemoveContainer" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073440 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.094461 4720 scope.go:117] "RemoveContainer" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.116135 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"]
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.126827 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"]
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.131762 4720 scope.go:117] "RemoveContainer" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.153609 4720 scope.go:117] "RemoveContainer" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"
Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.154137 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": container with ID starting with fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3 not found: ID does not exist" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154173 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} err="failed to get container status \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": rpc error: code = NotFound desc = could not find container \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": container with ID starting with fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3 not found: ID does not exist"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154197 4720 scope.go:117] "RemoveContainer" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"
Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.154623 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": container with ID starting with 5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a not found: ID does not exist" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} err="failed to get container status \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": rpc error: code = NotFound desc = could not find container \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": container with ID starting with 5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a not found: ID does not exist"
Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154723 4720 scope.go:117] "RemoveContainer" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"
Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.155027 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": container with ID starting with 9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24 not found: ID does not exist" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"
err="rpc error: code = NotFound desc = could not find container \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": container with ID starting with 9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24 not found: ID does not exist" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.155049 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"} err="failed to get container status \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": rpc error: code = NotFound desc = could not find container \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": container with ID starting with 9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24 not found: ID does not exist" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.690843 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc75242e-0455-42d2-9539-105eceed64f5" path="/var/lib/kubelet/pods/fc75242e-0455-42d2-9539-105eceed64f5/volumes" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.255034 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.255089 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.317050 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:57 crc kubenswrapper[4720]: I0121 15:12:57.166405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:57 crc kubenswrapper[4720]: I0121 15:12:57.483051 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:58 crc kubenswrapper[4720]: I0121 15:12:58.679178 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:58 crc kubenswrapper[4720]: E0121 15:12:58.679979 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:59 crc kubenswrapper[4720]: I0121 15:12:59.115488 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxjff" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" containerID="cri-o://ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" gracePeriod=2 Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.048820 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124192 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" exitCode=0
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124232 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"}
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124256 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"1c76c8ddc8c4ad7acb88ec4e3f6aae03bea3b9dd7c84d7a562ccaa3d54ee501e"}
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124271 4720 scope.go:117] "RemoveContainer" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124385 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff"
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.141814 4720 scope.go:117] "RemoveContainer" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.164061 4720 scope.go:117] "RemoveContainer" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.190994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") "
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.191113 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") "
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.191167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") "
Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.195798 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities" (OuterVolumeSpecName: "utilities") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.197885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp" (OuterVolumeSpecName: "kube-api-access-228jp") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "kube-api-access-228jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.214794 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252323 4720 scope.go:117] "RemoveContainer" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.252859 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": container with ID starting with ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02 not found: ID does not exist" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252894 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"} err="failed to get container status \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": rpc error: code = NotFound desc = could not find container \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": container with ID starting with ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252934 4720 scope.go:117] "RemoveContainer" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.253288 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": container with ID starting with 2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13 not found: ID does not exist" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253325 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"} err="failed to get container status \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": rpc error: code = NotFound desc = could not find container \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": container with ID starting with 2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253340 4720 scope.go:117] "RemoveContainer" 
containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.253802 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": container with ID starting with 200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91 not found: ID does not exist" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253825 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"} err="failed to get container status \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": rpc error: code = NotFound desc = could not find container \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": container with ID starting with 200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294509 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294551 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294561 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.457268 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.463961 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.689817 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" path="/var/lib/kubelet/pods/d48eb729-f085-415e-a0e1-8cf0be5b9547/volumes" Jan 21 15:13:12 crc kubenswrapper[4720]: I0121 15:13:12.686413 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:12 crc kubenswrapper[4720]: E0121 15:13:12.687083 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:25 crc kubenswrapper[4720]: I0121 15:13:25.678070 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:25 crc kubenswrapper[4720]: E0121 15:13:25.678678 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:37 crc kubenswrapper[4720]: I0121 15:13:37.678971 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:37 crc kubenswrapper[4720]: E0121 15:13:37.679914 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.805959 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806467 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806486 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806509 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806520 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806532 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806543 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806573 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806585 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806605 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806616 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806635 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806647 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" 
containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806998 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.807023 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.808960 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.817774 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.920452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.920920 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.921045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.022983 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.023701 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.024087 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.024428 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: 
\"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.023621 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.058279 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.140900 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.666541 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449512 4720 generic.go:334] "Generic (PLEG): container finished" podID="c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3" containerID="624cbc59330c55d150276766c1760e430d48c8e6cfc581498667d6c1bad0d164" exitCode=0 Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449568 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnjjz" event={"ID":"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3","Type":"ContainerDied","Data":"624cbc59330c55d150276766c1760e430d48c8e6cfc581498667d6c1bad0d164"} Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449764 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnjjz" event={"ID":"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3","Type":"ContainerStarted","Data":"fb5b52af19746d5479416778b0e6bc369f411deab7a84f12ba69f50850528450"} var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515134166457024461 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015134166460017370 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015134160650016506 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015134160651015457 5ustar corecore